Lijun Yin of Binghamton University is working to make computers understand facial expressions. Although this might make computers easier for everyone to use, he sees a particular need for the technology among people with a limited ability to communicate.
Caption: Binghamton University researcher Lijun Yin wants computers to understand inputs from humans that go beyond the traditional keyboard and mouse.
Credit: Jonathan Cohen
Earlier, Yin collaborated with Peter Gerhardstein, also of Binghamton University, to create a 3D facial expression library. That database, built from 2,500 facial expressions of 100 different people, is available free to nonprofit research groups. Since then, Yin has been working to teach computers to read those same emotional cues. The challenge is to translate tiny changes around a subject's eyes or mouth into a language that computers can interpret.
As Yin says:
Computers only understand zeroes and ones. Everything is about patterns. We want to find out how to recognize each emotion using only the most important features.
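To make the idea of recognizing emotion from "only the most important features" concrete, here is a minimal, purely illustrative sketch in Python. It reduces a handful of facial landmarks to a few normalized distances and matches them against labeled examples with a nearest-neighbor rule. Nothing here reflects Yin's actual methods or data; the landmark names, features, and training values are invented for illustration.

# Illustrative sketch only: a toy feature-based emotion classifier.
# Landmark names, features, and training data are hypothetical and
# are not taken from Yin's system.
import math

def extract_features(landmarks):
    """Reduce raw 2D landmark positions to a few distance-based features."""
    def dist(a, b):
        return math.hypot(landmarks[a][0] - landmarks[b][0],
                          landmarks[a][1] - landmarks[b][1])
    face_width = dist("left_cheek", "right_cheek")        # normalization factor
    return [
        dist("mouth_left", "mouth_right") / face_width,   # mouth width
        dist("mouth_top", "mouth_bottom") / face_width,   # mouth opening
        dist("left_brow", "left_eye") / face_width,       # brow raise
    ]

# Tiny invented "training set": feature vectors labeled with emotions.
TRAINING = [
    ([0.55, 0.10, 0.20], "happy"),
    ([0.40, 0.05, 0.30], "surprised"),
    ([0.38, 0.02, 0.15], "neutral"),
]

def classify(features):
    """Nearest-neighbor match against the labeled examples."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(TRAINING, key=lambda ex: sq_dist(ex[0], features))[1]

if __name__ == "__main__":
    sample = {
        "left_cheek": (0.0, 0.0), "right_cheek": (10.0, 0.0),
        "mouth_left": (3.0, -4.0), "mouth_right": (8.5, -4.0),
        "mouth_top": (5.5, -3.5), "mouth_bottom": (5.5, -4.5),
        "left_brow": (2.5, 3.0), "left_eye": (2.5, 1.0),
    }
    print(classify(extract_features(sample)))  # prints "happy"

A real system would of course use many more measurements and far larger training sets, but the pipeline is the same: turn subtle facial geometry into numeric patterns, then let the computer compare those patterns against known emotions.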
If successful, Yin anticipates myriad uses for the new technology, from determining whether patients are in pain to detecting lies. Perhaps one day our computers will read our moods and start us off with some soothing music before they display our credit card bills.
You can watch Yin explain his project below.