Emotion Recognition Systems – An uncomfortable innovation – Part II

As of now, it could be more beneficial to consider emotion recognition a supporting tool rather than an integral solution. Here is the concluding part of this two-part series.

Emotion detection use cases

Despite concerns, emotion recognition technology has already found several genuine use cases. A few of these are as follows:

  • Deepfake Detection: As early as 2019, the Computer Vision Foundation partnered with UC Berkeley, Google, and DARPA on a solution that could identify doctored videos by analysing the emotional expressions of the people featured in them. Identifying deepfake manipulations had assumed special importance in the run-up to the 2020 US Presidential election, a time when the nation was deeply concerned about politically motivated misinformation being spread through video content. Indeed, deepfake detection may be one of the most positive uses of emotion recognition technology.
  • Early Autism Detection: Expression analysis has proved to be a reliable technological diagnostic aid for autism spectrum disorder (ASD) in children. The system captures and analyses facial emotion as children interact with a computer screen, looking for early signs of ASD, and has performed consistently with 99.09% accuracy. Stanford University has leveraged the technology in its Autism Glass Project, which uses Google Glass, the company's face-worn computing system. Another similar project used machine learning to develop an app that screens children for autism through a behavioural coding algorithm, which records a subject's facial reactions to a movie and analyses the nature of their responses. Several other products offer similar capabilities and have helped children with ASD receive timely intervention.
  • Automotive Safety Systems: Machine learning-driven emotion and expression recognition is being integrated into several in-car safety suites. It can identify whether a driver is not looking at the road, is not wearing a mask where that is mandated, is making a handheld phone call, or is falling asleep. The system can be further configured to intervene when a driver is found to be lacking in alertness, triggering warnings or other precautionary actions. A minimal sketch of such an alertness check appears after this list.
  • Market Research Tools: Alfi, a Miami-based AdTech company, has built an ML-based facial expression recognition tool that detects people's emotions in public places and then delivers targeted, customised advertising based on its findings in real time. The algorithm can assess a person's age, ethnicity, and mood to show personalised content. Although this innovation is sure to raise plenty of privacy concerns, it looks as though such targeted advertising tools will become mainstream in the future.
  • Virtual Recruitment Screening: Facial expression recognition systems are fast becoming a part of HR tech. They are being used to screen candidates during interviews, analysing personality traits to determine employability. This use case risks straying into the grey areas of emotion recognition technology, as it can amplify the biases inherent in facial recognition itself. More on this later.
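
To make the automotive example above a little more concrete, here is a minimal, illustrative sketch of an alertness check, not any vendor's actual system. It assumes a Python environment with OpenCV installed and a camera at index 0, and it simply counts consecutive frames in which a face is visible but no open eyes are detected; real products rely on trained facial-landmark or gaze models and far richer signals.

```python
import cv2

# Haar cascades that ship with OpenCV; crude but dependency-free.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

CLOSED_FRAMES_THRESHOLD = 15  # roughly half a second at 30 fps
closed_frames = 0

cap = cv2.VideoCapture(0)  # a webcam stands in for the in-car camera here
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)

    eyes_visible = False
    for (x, y, w, h) in faces:
        roi = gray[y:y + h, x:x + w]
        eyes = eye_cascade.detectMultiScale(roi, scaleFactor=1.1, minNeighbors=5)
        if len(eyes) > 0:
            eyes_visible = True

    # A face with no detectable open eyes is treated as a possible sign of
    # drowsiness or looking away; the counter resets once eyes reappear.
    if len(faces) > 0 and not eyes_visible:
        closed_frames += 1
    else:
        closed_frames = 0

    if closed_frames >= CLOSED_FRAMES_THRESHOLD:
        print("ALERT: driver may be drowsy or not watching the road")

    cv2.imshow("driver-monitor", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```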

Commercial emotion detection products

Several companies have started rolling out commercial products based on emotion recognition technology.

  • Microsoft Azure's Emotion API can return emotion recognition estimates alongside its usual array of facial analysis features.
  • Google Cloud Vision and Amazon Rekognition are two other big names. Both provide facial sentiment detection as part of their respective facial recognition APIs; a short sketch of querying Rekognition's emotion estimates follows this list.
  • SkyBiometry is another commercial facial detection and analysis API, which can clearly differentiate between anger, disgust, fear, happiness, surprise, sadness, and a neutral mood.
  • Apple patented emotion recognition mechanisms as far back as 2012 and acquired the artificial intelligence startup Emotient in 2016. A 2019 patent also prompted industry experts to speculate that Apple's Siri assistant could use such technology to identify user emotions via facial expressions, although there has been no formal announcement yet. Notably, Apple's Emotient scored the highest in a 2020 study of eight commercially available facial expression recognition systems.
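
As an illustration of how these services are typically consumed, the following is a minimal sketch of requesting emotion estimates from Amazon Rekognition via the boto3 SDK. The AWS credential setup and the image file name ("photo.jpg") are assumptions for the example; the emotion labels and their confidence scores come back as part of each detected face's details.

```python
import boto3

# Assumes AWS credentials are already configured locally;
# "photo.jpg" is just a placeholder file name for this sketch.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # "ALL" includes the Emotions attribute
    )

for face in response["FaceDetails"]:
    # Each face carries a list of emotion labels with confidence scores;
    # report the highest-confidence one.
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    print(f"{top['Type']} ({top['Confidence']:.1f}% confidence)")
```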

Not a distant possibility

Despite its wide range of possibilities, emotion recognition technology still sounds too futuristic, right?

Wrong!

The technology is already very much in use. The European Union is leveraging it to detect illegal border infiltrations. South Korean companies are using it in their recruitment processes. China is using it in education and for driving safety. And here in India, several government establishments are using it for a variety of disturbing purposes.

The Uttar Pradesh police proposed a Safe City Initiative in 2021 that included – among various other features – an AI system that could detect "distressed" women through facial recognition. Sanitation workers in Haryana have been given wearables equipped with microphones and trackers so that supervisors can monitor and analyse their performance levels.

The Indian market for emotion recognition technologies has grown quietly over the last few years. However, fears of possible misuse run deep. Unfortunately, there are reasons to believe that the technology is being perceived by many business leaders as an improved and evolved surveillance tool. And therein lies the danger of emotion recognition systems.

An uncomfortable innovation

Can human emotions be recognised accurately enough to encapsulate them in a logical system? Is the way we – humans – assess the moods of other humans scientific enough to be replicated in machine learning systems? These are questions with no ready or easy answers. Researchers at the University of Glasgow have observed that there are essential differences in the way people from around the world manifest and perceive emotion. There is also evidence that this perception differs between women and men. How can an algorithm account for such regional, cultural, and gender variations in emotion detection?

Then comes an even more disturbing concern. Emotion detection has been criticised for resting on pseudoscientific assumptions that draw connections between a person's external appearance and inner emotional state. In effect, it legitimises the long-discredited, dubious theories of physiognomy and phrenology.

Finally, emotion recognition represents a fundamental shift from earlier biometric systems: rather than objectively identifying or verifying a person, it judges the person subjectively, based on what that person is thought to be thinking or feeling. This ushers in a new kind of biometric surveillance in which unilateral and consequential assumptions are made about a subject's emotional state and character, with little to no avenue for meaningful accountability. If an emotion recognition system flags someone as a person with "dark traits", it can be difficult or impossible for that person to prove or disprove the assumption.

The credibility problems seem to stem more from doubts about the technology's core assumptions than from its machine learning implementations. As of now, it could be more beneficial to consider emotion recognition a supporting tool rather than an integral solution, and to continue research on multi-factor indicators of mood to create more effective emotion detection algorithms. And, of course, robust and thoughtful regulation needs to be in place to ensure that the burden of action and transparency is placed on the entities in power, not on those potentially subject to such surveillance systems.

[Concluded]

 
