Will 2016 bring emotionally intelligent machines? Andrew Moore, dean of Carnegie Mellon University's School of Computer Science and a former vice president at Google, told ZDNet something fascinating: he predicts that 2016 will see rapid growth in research into machines that understand emotion.
Robots, smartphones, and computers will quickly begin to understand how we feel and will be able to respond accordingly.
"There will be immediate uses," he explains.
"Monitoring a patient in the operating room with an emotionally intelligent computer will let doctors gauge discomfort even when the patient is unable to communicate. Measuring student engagement by reading their emotional responses will help teachers be more effective. These are clear wins."
But, Moore warns, handing emotional comprehension to our gadgets can feel unsettling.
Imagine this valuable information falling into the hands of advertisers. The technology must advance under scrutiny and critique.
People give off many distinct cues about how they feel, both consciously and unconsciously, and research in this field has followed several parallel paths.
Vocal patterns can reveal stress and excitement, and the movement of facial muscles provides a revealing map of a person's inner state.
"Cameras now offer much higher resolution, and high-resolution cameras can track tiny facial movements, even individual hairs."
Robotics and computer scientists have applied advances in artificial intelligence to existing psychological research on emotional intelligence.
Currently, most emotionally intelligent machines infer emotional states from language, body posture, and facial movements. But there is also the tantalizing possibility that machine learning could discover even better strategies for interpreting emotions. Ultimately, this could lead to devices that are more emotionally perceptive than humans themselves.
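To make the language channel concrete, here is a minimal, purely illustrative sketch of lexicon-based emotion scoring on text. The word lists are invented for the example; real systems learn such mappings from data and fuse them with voice and facial signals:

```python
# Toy sketch: score the emotional tone of a sentence with a hand-built lexicon.
# These word sets are illustrative assumptions, not a real affect lexicon.
POSITIVE = {"happy", "glad", "great", "calm", "excited", "love"}
NEGATIVE = {"sad", "angry", "afraid", "stressed", "hate", "uncomfortable"}

def sentiment_score(text: str) -> float:
    """Return a score in [-1, 1]: below zero suggests distress, above zero well-being."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    hits = [1 for w in words if w in POSITIVE] + [-1 for w in words if w in NEGATIVE]
    if not hits:
        return 0.0  # no emotional cues detected
    return sum(hits) / len(hits)

print(sentiment_score("I am happy and calm"))         # 1.0
print(sentiment_score("I feel stressed and afraid"))  # -1.0
```

A learned model would replace the fixed lexicon with weights trained on labeled examples, which is where the article's machine-learning possibilities come in.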