
Emotion analytics may expose your true feelings to HR

Deep learning systems can analyze involuntary facial reactions invisible to the human eye. From these reactions, HR can try to glean whether you are a good cultural fit.

Emotion analytics software can see right through your poker face. Its algorithms can analyze video and identify involuntary microexpressions that may last as little as a twenty-fifth of a second, invisible to the naked eye. The technology can be used to assess job candidates. You're trying to project confidence, but the emotion recognition software is detecting uncertainty.

Does this capability trouble you?

Emotion analytics may have a ready use case in security: to help pick out the threat in a crowd. This technology is also used in marketing research and has potential to help diagnose patients in telemedicine. The algorithms and deep learning technologies that make this analysis possible are also available to HR departments.

There are seven basic emotions: anger, contempt, disgust, fear, happiness, sadness and surprise. Your face may register them in milliseconds. Vendors are analyzing video and, from your responses to questions, can create personality profiles. They may be trying to identify someone who will be a good cultural fit in your business.

Early days for emotion recognition in HR

It is still very early days for emotion recognition technology in HR, analysts say. Its potential for harm may keep some away.

This was illustrated recently by two Stanford University researchers, Yilun Wang and Michal Kosinski. In a paper, they showed that face-reading technology could use photographs to determine sexual orientation with a high degree of accuracy. The intent of their work was to make the public "aware of the risks that they might be facing already."

Those risks included the possibility that a government could use emotion analytics in discriminatory ways. But this technology, like other potentially powerful technologies, is unlikely to be shelved because it can be abused.

One person interested in emotion analytics technology is Luke Fryer, CEO of New York-based Harri, a services firm that provides what it terms a "workforce operating system." This system provides everything needed to manage a workforce, including recruiting, learning management, scheduling, time and attendance, and performance analysis.

Harri uses face recognition in an iPad-based time clock, but the interface is also a two-way communication tool for pass-along announcements and getting worker feedback. For a restaurant worker, for instance, it may show today's specials by picture or video and "allow an employee to instantly provide feedback" about the special, he said.

Fryer said Harri's facial recognition systems could be adapted to also detect mood at the point of clock-out.

Fryer hasn't deployed this capability and is open about his reservations.

Emotion analytics has a Big Brother feel

"There's something very Big Brother about it," Fryer said. Nonetheless, the technology may be useful, he argued. He noted that there is a high degree of inaccuracy in manually collected feedback when someone is asked, for instance, to rate something on a scale of one to five. But he believes that emotion analytics software may more accurately tell how employees really feel.

If an employer learns that 60% of the employees had an "anxious expression" when asked something, "that might tell us something about the underlying feedback we're getting," Fryer said.
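The kind of aggregate signal Fryer describes can be sketched in a few lines. This is a hypothetical illustration, not Harri's implementation: it assumes a classifier has already assigned each employee's reaction a single dominant emotion label, and simply computes the share showing a given expression.

```python
from collections import Counter

def emotion_share(labels, target):
    """Return the fraction of employees whose dominant
    detected emotion matches the target label."""
    counts = Counter(labels)  # missing labels count as zero
    return counts[target] / len(labels)

# Illustrative labels, one per employee response
responses = ["anxious", "neutral", "anxious", "happy", "anxious"]
print(emotion_share(responses, "anxious"))  # 0.6
```

An employer seeing a 0.6 share here would have the "60% anxious" signal Fryer mentions; the hard part in practice is the upstream classification, not this aggregation.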

Fryer remains cautious, and any deployment would have to be slow and well-tested. Employee acceptance would depend heavily on how the face-reading technology is presented to them, he said.

Helen Poitevin, analyst at Gartner, isn't so sure that there is a use case for this technology in measuring employee engagement. The cost of implementation and "the perceived risk that employees will feel spied on" will slow adoption of this technology, she said.

London-based Human is a startup that provides this type of analysis. It analyzes video at 50 frames per second "to ensure that millisecond movements on the face" that the human eye cannot see "are captured," said Yi Xu, CEO and founder.

Software recognizes facial expressions

The analysis includes identifying anger, happiness or surprise, as well as passion, confidence and curiosity, among other characteristics for personality profiling. Human's customers can also use the software for determining customer satisfaction and experience, for recruitment, and in transportation security, for instance to try to identify people with suicidal intentions, Yi said.

In HR, a client may use a video interview for an initial assessment. The applicant answers up to 10 questions on video, and Human's system delivers emotion and characteristic scores.

Clients may ask for "blind scores," meaning evaluators don't see the candidate's gender, race or age -- an effort to reduce discrimination and bias. An employer may want the "most curious candidate" or the "most passionate candidate," and the report that Human delivers will have that information, Yi said. From that initial video vetting, the in-person interviews take place.
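The "blind scores" idea can be sketched as follows. This is a hypothetical illustration of the general technique, not Human's actual schema: field names, trait names and scores are invented, and the sketch assumes each candidate report is a dictionary with demographic fields and per-trait scores.

```python
# Demographic fields withheld from evaluators (illustrative)
BLIND_FIELDS = {"gender", "race", "age"}

def blind_rank(candidates, trait):
    """Rank candidate reports by one trait score,
    with demographic fields stripped out first."""
    blinded = [
        {k: v for k, v in c.items() if k not in BLIND_FIELDS}
        for c in candidates
    ]
    return sorted(blinded, key=lambda c: c["scores"][trait], reverse=True)

candidates = [
    {"id": "A", "age": 29, "scores": {"curiosity": 0.81, "passion": 0.64}},
    {"id": "B", "age": 45, "scores": {"curiosity": 0.92, "passion": 0.58}},
]
top = blind_rank(candidates, "curiosity")[0]
print(top["id"])  # B
```

The point of the design is that the ranking step never sees the demographic fields at all, so a reviewer asking for the "most curious candidate" gets an answer that cannot mechanically depend on them.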

Human works with psychologists and clients, processes millions of data points and trains its deep learning system to pick up on patterns. "There is no right or wrong personality profile," Yi said.

Audio analysis is also something that vendors use, particularly in the screening of call center workers. That use makes sense, Gartner's Poitevin said, since communication over the phone is the main driver of performance.

Adoption slow due to 'creepy' factor

Otherwise, adoption "has been slow due to the 'creepy' factor of evaluating emotion of workers," Poitevin said. There is occasionally interest in these kinds of tools to help detect burnout, suicide risk or other psychosocial issues in the workplace. "But it is hard to put in place due to privacy concerns and a sense that the data collection and invasion of privacy outweighs the benefits individuals or organizations would get from such analysis," she said.

Merritt Maxim, security analyst at Forrester, said the focus in security is on face recognition. Emotion recognition systems may lead to unnecessary extra screening and "accusations against entirely innocent people."

Rob Light, senior research specialist at G2 Crowd, said he sees little adoption of this technology so far but believes it will likely be deployed. If the capability exists, someone will take advantage of it, he said.

"I think there is an uneasiness to all of this," Light said.

This was last published in December 2017


Join the conversation

What purpose do you see for using emotion analytics technology?
Minute details may be better captured by AI. While this is certainly a good change, it must be read together with the human mind: the brain's neural network and AI are completely different. One is programmed to deliver; the other responds to situational need. A second aspect is culture -- how can we standardize expressions across cultures? A third is that this applies only to "rule-based" facial expressions, not to all of them. And how comfortable will I be if AI evaluates my facial expression? Maybe I hated my interviewer -- will it show that? I may prefer it didn't. Someone could also be going through a terrible personal disaster; why should AI discover and present that when I am not comfortable with it? Hence, co-creation with the human mind is absolutely essential.
Understanding emotions, helping people to choose better. Omitting Kahneman, Thaler and others, I think the best recap is of Maya Angelou: “At the end of the day people won't remember what you said or did, they will remember how you made them feel.”