Biofeedback replaces surveys and likes
All of the big tech companies are currently investing in research on artificial intelligence or “affective computing”. When machines learn to recognize our feelings, they will be able to read almost all our desires in our eyes. According to GDI researchers, customer satisfaction will unquestionably become the measure of all things.
This is an excerpt of the GDI Study “Wellness 2030 – The new techniques of happiness”. Download the study for free.
When machines learn to recognize our feelings, they will be able to read almost all our desires in our eyes and no one will ever have to ask us how we are again. In October 2001, Toyota presented a concept car called “The Pod”. The car was able to recognize its driver’s mood based on their reactions and driving style, and displayed this on the car’s exterior using colored LEDs: red for anger, yellow for happiness, blue for sadness. There was also an antenna that moved like a dog’s wagging tail. The Pod was conceived as the “car of the future”, but it remained a concept car and never went into production.
Since then, the technology for measuring emotions has made great strides and is now being tested in a number of fields of application. Apps are attempting to trace behavioral patterns and emotions from passively collected smartphone data. It is possible to discern how a person is feeling simply by looking at the way they use their cell phone. Which apps do they use frequently? How often do they text? How long do they speak for, and with whom? How many steps do they take and how long do they sleep? This panoply of smartphone data offers up precise indications of a user’s mental and physical health. “Time Well Spent”, for example, is an app that helps people monitor their screen time. It investigates how the screen time a person accumulates on particular apps affects their mood. The results show that more than 40 minutes of screen time per day has a negative impact on our well-being, regardless of which apps we use.
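To make the idea concrete, the passive signals listed above could be combined into a rough well-being score. The following sketch is purely illustrative: the function name, feature weights, and caps are assumptions, not the logic of any actual app; only the roughly 40-minute screen-time threshold comes from the finding cited above.

```python
# Hypothetical sketch of passive-data well-being scoring.
# All weights and feature names are invented for illustration.

def wellbeing_indicator(screen_minutes, steps, sleep_hours, texts_sent):
    """Combine passively collected smartphone signals into a rough score.

    Higher is better. The 40-minute screen-time threshold mirrors the
    result cited in the text; the other weights are illustrative only.
    """
    score = 0.0
    # The cited finding: more than ~40 minutes of daily screen time
    # correlates with lower well-being, regardless of the app.
    if screen_minutes > 40:
        score -= (screen_minutes - 40) * 0.05
    # Physical activity and sleep as positive signals (capped).
    score += min(steps, 10_000) / 10_000   # 10k-step goal
    score += min(sleep_hours, 8) / 8       # 8-hour sleep goal
    # Social interaction as a weak positive signal.
    score += min(texts_sent, 20) * 0.01
    return score

print(wellbeing_indicator(screen_minutes=120, steps=6000,
                          sleep_hours=7, texts_sent=10))
```

A real system would learn such weights from data rather than hard-coding them, but the principle is the same: everyday usage signals stand in for an explicit “How are you?”.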
In the near future, an increasing number of the apps we use in our everyday lives could be fitted with biometric sensors and facial and voice recognition software that analyze our emotions. Through software upgrades, companies can integrate these functions into the products we are already using today. Apple, for example, has recently bought Emotient, one of the leading companies for facial recognition technology. It is possible that you’ll soon hear Siri volunteering questions like this: “Your facial expression seems sad – should I download ‘Trainwreck’ from iTunes?”
All of the big tech companies are currently investing in research on artificial intelligence or “affective computing” – that is, the development of systems and devices that can read, interpret, process, simulate and predict human emotions. With software like “HowWhoFeelInVideo”, for example, Amazon is analyzing the emotions on faces in sampled video clips. LA startup “Polygram” has developed software that studies facial reactions to shared photos. According to Polygram, the app can “read” facial expressions, assign them emotions such as boredom, happiness and interest, and automatically translate them into emojis. The social media app, which works like a cross between Snapchat and Instagram, sends out the facial expressions of individual followers in real time as they view the photos.
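The last step Polygram describes, turning a recognized emotion into an emoji, is conceptually a simple lookup once a classifier has produced a label. The sketch below is an assumption about how such a translation might look; the labels, emoji choices, and names are illustrative, not Polygram’s actual mapping.

```python
# Illustrative sketch: translating recognized emotion labels into emojis.
# Labels and emoji choices are assumptions, not any vendor's real mapping.

EMOTION_TO_EMOJI = {
    "happiness": "😀",
    "boredom": "😐",
    "interest": "🤔",
    "sadness": "😢",
}

def expression_to_emoji(emotion_label: str) -> str:
    """Map a recognized emotion label to an emoji; fall back to neutral."""
    return EMOTION_TO_EMOJI.get(emotion_label, "😶")

# Simulated real-time feedback: each follower's reaction becomes an emoji.
reactions = {"alice": "happiness", "bob": "boredom"}
feedback = {user: expression_to_emoji(e) for user, e in reactions.items()}
print(feedback)  # → {'alice': '😀', 'bob': '😐'}
```

The hard part, of course, is the classifier producing the labels; the emoji translation shown here is only the presentation layer the app exposes to users.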
Few innovations have brought biometrics into our everyday lives as quickly and effectively as the 3D facial recognition (Face ID) on Apple’s new iPhone X. Along with further data collected from other sources, such as sensors in clothes, furniture and toilets, innovation in the tech industry is gradually creating an ever-denser emotion recognition network and using it to provide information and feedback on our well-being.
The developers of artificial emotional intelligence (AEI) see it as primarily beneficial, claiming that AEI will help us make better decisions. But it is not only the creators of such technologies who see opportunity in them. There are also economic arguments for emotion tracking: it allows us to measure customer satisfaction in real time, and the process is better, cheaper and simpler than traditional customer survey methods. With the arrival of automatic emotion recognition, customer satisfaction will unquestionably become the measure of all things. Providers of consumer products and services will have to adapt according to this customer satisfaction data.
Read more about the advantages of “Artificial Emotional Intelligence” for education, consumption and self-awareness in the GDI Study “Wellness 2030 – The new techniques of happiness”.