Emotional Intelligence has recently become a much-talked-of concept, even though it was popularized some 20 years ago, in 1995. This special ability of a person to recognize and handle their own feelings and emotions was studied by Dr. Daniel Goleman, who brought the public’s attention to this particular cognitive area.
Since then a great number of studies have been carried out in the field, and the importance of EI has even been proposed to rank above that of IQ in any interpersonal activity, and especially in business.
With the advent of technologies unthinkable in 1995, emotional intelligence research has stepped onto a new level. We have learned what the concept is and how to apply that knowledge consciously, at least on a personal basis, and we are already diving deeper into emotion analytics with a range of tools we have recently acquired and are gradually refining.
We are talking about the plethora of APIs and SDKs that can interpret a person’s (a user’s) mood by tracking human emotive gestures. Based on reactions to visual, audio, and kinaesthetic stimuli and distractions, recent technologies can sort emotions into 7 major groups:
- anger, contempt, fear, happiness, sadness, surprise, and disgust.
The methods used are facial detection, sonic algorithms, and semantic analysis; the inputs are photo and video material, text, and speech samples.
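As a toy illustration of the semantic-analysis side, here is a minimal lexicon-based sketch. The lexicon and function names are hypothetical, invented for this example; real tools use trained models rather than keyword lookups:

```python
from collections import Counter

# Purely hypothetical mini-lexicon mapping keywords to the seven emotion
# groups; a production semantic analyzer would use a trained model instead.
EMOTION_LEXICON = {
    "happy": "happiness", "love": "happiness",
    "angry": "anger", "furious": "anger",
    "sad": "sadness", "miserable": "sadness",
    "scared": "fear", "afraid": "fear",
    "wow": "surprise", "unexpected": "surprise",
    "gross": "disgust", "awful": "disgust",
    "whatever": "contempt",
}

def classify_text(text: str) -> str:
    """Return the dominant emotion group found in the text, or 'neutral'."""
    words = (w.strip(".,!?;:") for w in text.lower().split())
    hits = Counter(EMOTION_LEXICON[w] for w in words if w in EMOTION_LEXICON)
    return hits.most_common(1)[0][0] if hits else "neutral"

print(classify_text("I am so happy, I love this!"))  # happiness
```

Crude as it is, the sketch shows the general shape: map observable signals (here, words) onto a small, fixed set of emotion groups and report the dominant one.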
Such a blend of psychology and technology makes a very specific kind of analysis possible, and its conclusions grow all the more important as the scale increases.
Why Does It Matter?
Because it is profitable – for any business, literally every business.
Getting back to EI as a basis: on a personal level in business, it is about leadership and organization built on emotional intelligence skills – human resources management on a different level.
Besides, emotion analytics is very likely to replace many of the analytical marketing tools that are in use now and have been considered effective so far – but not for long. According to recent reports, the facial recognition market alone is projected to grow to $6.19 billion within the next five years; add emotion recognition to the equation and the number becomes hard even to estimate. So do the uses for the method.
Take any group testing in business – for a commercial, or an audience’s response to a movie or a specific product. This is a marketing campaign on an entirely new level: you get a huge amount of automatically accumulated data that is not only collected more efficiently but is also far more reliable, since immediate emotional reactions are very hard to control.
Thus, the response of any target audience is likely to be more sincere and all the more representative. The applications, as already said, have barely any limitations: from getting real TV ratings to improving the security systems of public venues (stadiums, airports, etc.).
In the opinion of Paul Zak (of the Center for Neuroeconomics Studies at Claremont Graduate University), we are likely to get emotion-optimized products and services very soon; it is not even a question of tomorrow but rather of today. And to get back to the question of profit: no business that deals directly with customers will be able to stay unaffected.
One of the most vivid examples is how large and influential businesses have already taken emotion analytics on board. Google and Pepsi, along with 20th Century Fox and Jaguar, have already turned to the “applied neuroscience platform” (in its own terminology) Lightwave for parsing people’s biometrics and for collecting and analyzing consumers’ emotional and mental states. The company also measured the audience’s response to the famous movie “The Revenant”, and the intensity of its emotional reactions, using real-time biometric data (with a number of quite far-reaching conclusions). Lightwave, like a growing number of similar apps and platforms, is paving the road to an entirely new emotion economy.
Among other APIs that recognize mood is Affectiva: it offers large-scale collection and analysis of data – of facial expressions, to be precise – with the help of visual analytics tools. At present it owns a database of 3,289,274 analyzed faces.
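Affectiva’s SDK itself is proprietary, but the core of such large-scale analysis – averaging per-frame, per-viewer emotion scores into audience-level metrics – can be sketched in a few lines. All data and names here are hypothetical, not Affectiva’s actual API:

```python
from statistics import mean

# Hypothetical per-frame records: each maps emotion names to classifier
# scores in [0, 1]. A real pipeline would produce one record per viewer
# per video frame.
frames = [
    {"joy": 0.8, "surprise": 0.1},
    {"joy": 0.6, "surprise": 0.3},
    {"joy": 0.1, "surprise": 0.9},
]

def aggregate(records):
    """Average each emotion's score across all records."""
    emotions = {e for r in records for e in r}
    return {e: round(mean(r.get(e, 0.0) for r in records), 3)
            for e in emotions}

print(aggregate(frames)["joy"])  # 0.5
```

The per-emotion averages are what turn millions of individual faces into a single audience-response curve that a marketing team can actually read.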
The company’s CEO and co-founder, Rana el Kaliouby, points out that the EI factor can no longer be neglected; nor can it be treated separately in the era of the Internet of Things. Digital devices have already become an inseparable part of our lives, so they too need to be adapted in terms of emotional intelligence – and the result will be all the more beneficial.
The same can be said of two other examples: Synesketch and Moodies.
The former is a text-to-emotion “converter” – an iTunes player for written text, so to speak. Basically, it is a tool that analyzes text for emotional content and then converts the result into a unique visualization.
The latter translates speech into the language of emotions. Literally, the way one speaks – the words used, the tone, the speed, and other speech-related factors – is analyzed, and from this information the tool gives feedback on the speaker’s general emotional state and mood. The analysis also supports some conclusions about a person’s decision-making characteristics.
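Moodies’ actual algorithms are proprietary, but speech-emotion tools of this kind typically start from low-level prosodic features such as loudness and a pitch/noisiness proxy. A minimal, purely illustrative sketch over raw audio samples:

```python
import math

def rms_energy(samples):
    """Root-mean-square energy: louder speech often correlates with arousal."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def zero_crossing_rate(samples):
    """Fraction of adjacent sample pairs that change sign – a crude
    pitch/noisiness proxy used in simple speech-emotion feature sets."""
    crossings = sum(1 for a, b in zip(samples, samples[1:])
                    if (a < 0) != (b < 0))
    return crossings / (len(samples) - 1)

# Synthetic toy signal standing in for microphone input.
signal = [0.4, -0.4, 0.4, -0.4, 0.4, -0.4, 0.4, -0.4]
print(rms_energy(signal), zero_crossing_rate(signal))
```

Features like these, computed over short windows and fed to a classifier, are the usual bridge from a raw waveform to an emotion label.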
EI and Emotion Recognition Perspective
While the achievements are numerous and impressive, the outlook is not entirely clear. Emotion recognition via facial detection (or any other means) will only keep advancing, and with it a number of ethical questions is likely to arise – especially regarding authorization and consent for any such analysis. This is particularly true in the business and advertising sector, where profit runs the show and misuse can be expected.
Article by: Natallia Tsahelnik