Emotion analytics software can see right through your poker face. Its algorithms analyze video to identify involuntary microexpressions, flashes as brief as a twenty-fifth of a second and invisible to the naked eye. The technology can be used to assess job candidates: you may be trying to project confidence, but the emotion recognition software detects uncertainty.
Does this capability trouble you?
Emotion analytics may have a ready use case in security: to help pick out the threat in a crowd. This technology is also used in marketing research and has potential to help diagnose patients in telemedicine. The algorithms and deep learning technologies that make this analysis possible are also available to HR departments.
There are seven basic emotions: anger, contempt, disgust, fear, happiness, sadness and surprise. Your face may register them in milliseconds. Vendors analyze video of your responses to questions and use it to build personality profiles. They may be trying to identify someone who will be a good cultural fit in your business.
Early days for emotion recognition in HR
It is still very early days for emotion recognition technology in HR, analysts say. Its potential for harm may keep some away.
This was illustrated recently by two Stanford University researchers, Yilun Wang and Michal Kosinski. In a paper, they showed that face-reading technology could use photographs to determine sexual orientation with a high degree of accuracy. The intent of their work was to make the public "aware of the risks that they might be facing already."
Those risks included the possibility that a government could use emotion analytics in discriminatory ways. But this technology, like any other potentially powerful technology, is unlikely to be shelved because of its abuse risk.
One person interested in emotion analytics technology is Luke Fryer, CEO of New York-based Harri, a services firm that provides what it terms a "workforce operating system." This system provides everything needed to manage a workforce, including recruiting, learning management, scheduling, time and attendance, and performance analysis.
Harri uses face recognition in an iPad-based time clock, but the interface is also a two-way communication tool for pass-along announcements and getting worker feedback. For a restaurant worker, for instance, it may show today's specials by picture or video and "allow an employee to instantly provide feedback" about the special, he said.
Fryer said Harri's facial recognition systems could be adapted to also detect mood at the point of clock-out.
Fryer hasn't deployed this capability and is open about his reservations.
Emotion analytics has a Big Brother feel
"There's something very Big Brother about it," Fryer said. Nonetheless, the technology may be useful, he argued. He noted that manually collected feedback has a high degree of inaccuracy when someone is asked, for instance, to rate something on a scale of one to five. But he believes emotion analytics software may more accurately reveal how employees really feel.
If an employer learns that 60% of the employees had an "anxious expression" when asked something, "that might tell us something about the underlying feedback we're getting," Fryer said.
Fryer remains cautious, however, and said any deployment would have to be slow and well-tested. Employee acceptance would depend largely on how the face-reading technology is presented to employees, he said.
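The aggregate Fryer describes amounts to a simple tally of per-employee emotion labels. The sketch below is illustrative only; the labels and data are hypothetical, not output from Harri's or any vendor's actual system.

```python
# Hypothetical per-employee emotion labels, as an emotion analytics
# system might report them for one survey question.
responses = ["anxious", "neutral", "anxious", "happy", "anxious"]

# Share of employees whose dominant expression was "anxious".
anxious_share = responses.count("anxious") / len(responses)
print(f"{anxious_share:.0%} of employees registered an anxious expression")
# → 60% of employees registered an anxious expression
```

A real deployment would aggregate scores per question across many employees, which is what would let an employer notice that, say, 60% looked anxious when a particular topic came up.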
Helen Poitevin, analyst at Gartner, isn't so sure there is a use case for this technology in measuring employee engagement. The cost of implementation and "the perceived risk that employees will feel spied on" will slow its adoption, she said.
London-based Human is a startup that provides this type of analysis. It analyzes video at 50 frames per second "to ensure that millisecond movements on the face" that the human eye cannot see "are captured," said Yi Xu, CEO and founder.
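The 50 fps figure follows from simple arithmetic: a frame every 20 milliseconds means even a microexpression lasting a twenty-fifth of a second (40 ms) spans about two frames. A quick sketch of that calculation:

```python
# Why 50 frames per second suffices to catch fast microexpressions.
FPS = 50
frame_interval_ms = 1000 / FPS       # 20.0 ms between consecutive frames

# The fastest microexpressions last roughly 1/25 of a second.
microexpression_ms = 1000 / 25       # 40.0 ms

# Number of frames that fall within the expression's duration.
frames_captured = microexpression_ms / frame_interval_ms
print(frames_captured)  # → 2.0
```

At a typical 24 or 30 fps, the same expression might land on only a single frame, which is presumably why Human samples at the higher rate.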
Software recognizes facial expressions
The software identifies anger, happiness and surprise, as well as passion, confidence and curiosity, among other characteristics used for personality profiling. Human's customers also use it to gauge customer satisfaction and experience, in recruitment, and in transportation security to try to identify people with suicidal intentions, Yi said.
In HR, a client may use a video interview for an initial assessment. The applicant answers up to 10 questions on camera, and Human's system delivers emotion and characteristic scores.
Clients may ask for "blind scores," which omit the candidate's gender, race and age to eliminate discrimination and bias. An employer may want the "most curious candidate" or the "most passionate candidate," and the report Human delivers will include that information, Yi said. After that initial video vetting, in-person interviews take place.
Human works with psychologists and clients, processes millions of data points and trains its deep learning system to pick up on patterns. "There is no right or wrong personality profile," Yi said.
Audio analysis is also something that vendors use, particularly in the screening of call center workers. That use makes sense, Gartner's Poitevin said, since communication over the phone is the main driver of performance.
Adoption slow due to 'creepy' factor
Otherwise, adoption "has been slow due to the 'creepy' factor of evaluating emotion of workers," Poitevin said. There is occasionally interest in these kinds of tools to help detect burnout, suicide risk or other psychosocial issues in the workplace. "But it is hard to put in place due to privacy concerns and a sense that the data collection and invasion of privacy outweighs the benefits individuals or organizations would get from such analysis," she said.
Merritt Maxim, security analyst at Forrester, said the focus in security is on face recognition. Emotion recognition systems may lead to unnecessary extra screening and "accusations against entirely innocent people."
Rob Light, senior research specialist at G2 Crowd, said he sees little adoption of this technology so far but believes it will likely get deployed. If the ability exists, someone will take advantage of it, he said.
"I think there is an uneasiness to all of this," Light said.