Future Life, Inc.

Micro-Expression Graphing


Your Emotional Footprint

The technology pioneered by Future Life maps the Human Emotional Footprint using Facial Micro-Expressions of Emotion, or FMEE analysis, to objectively measure human emotional responses and actions that are usually undetectable to the human eye. We call this technology Face2Face, and our initial focus is on post-traumatic stress disorder and on credibility assessment and deception detection. Preliminary findings suggest that Face2Face can improve the accuracy of identifying mental health conditions over time by more than fifty percent versus the rates currently reported by government clinicians.


Telehealth Technology

We show our emotions in many ways, such as our body language and vocal tones, but the most compelling way is through our facial expressions. With strangers, our predictions of their thoughts and feelings are accurate roughly 20% of the time; with close intimates, like husbands and wives, the rate goes up to about 35%. Even the best among us at reading each other's thoughts and feelings rarely reach 60%.1 Our deepest thoughts and feelings are easily kept secret from one another, and often from ourselves.2

There is a constant stream of involuntary non-verbal communication flowing between us on our faces. The expressions of emotion we see on one another's faces in typical interactions last from half a second to 4 seconds. These are classified as Macro-expressions, and we know people can control them. For example, we often smile when we're supposed to, not because we actually feel like smiling.

However, while we are constantly communicating and assessing each other's Macro-expressions, a whole other layer of communication is occurring. We also have Facial Micro-Expressions of Emotion (FMEEs) that last 1/15th of a second or less. Measured against our neutral baseline, these involuntary facial movements always express the same 6 human emotions: Sadness, Fear, Anger, Happiness, Surprise, and Disgust (see Fig. 1 below).

Micro-Expressions of Emotions
Fig. 1

Not only are these micro-expressions consistently related to the same underlying feelings, but they are also universal, as if they were hardwired into our DNA. As far back as the late 1960s, serious scientific research was showing that the same 6 basic sets of facial muscle movements reflect the exact same emotions, no matter where you are on the planet, no matter how advanced or primal your origins (see Fig. 2 of a Fore tribesman in New Guinea showing Joy, Sorrow, Anger, and Disgust).3

Fore Tribesman in New Guinea
Fig. 2

Facial micro-expressions of emotion have been adapted to uses as diverse as marketing and prediction (for example, trying to determine the predispositions of jurors), because they reveal the true feelings behind our words even when we're trying to conceal them. But even when facial-emotion analysis was used to establish emotional status, assess credibility, or predict behavior, it still relied on the inconsistent skills of the human observer. Hence there was no way to tap the full potential of FMEEs until Future Life developed its emotional mapping tool, Face2Face.

Face2Face has become a game changer not only in the area of Behavioral Health but in other markets as well. Running on a telehealth platform, the program produces a 3-dimensional model of the subject's face (see Fig. 3), compares it to the structure of all 6 universal emotions and the subject's neutral baseline, then compiles and presents a readout of FMEEs for every 1/30th of a second of video captured. It is 80 to 85% accurate for each FMEE classified.
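Future Life has not published Face2Face's internals, so the following is only a minimal sketch of the per-frame pipeline described above; the landmark count, emotion templates, and scoring rule are invented stand-ins. It labels each 1/30th-second frame by comparing the face's displacement from the subject's neutral baseline against hypothetical templates for the 6 universal emotions.

```python
# Illustrative sketch only: all names and values below are hypothetical,
# not Face2Face's actual model or parameters.
import numpy as np

EMOTIONS = ["Sadness", "Fear", "Anger", "Happiness", "Surprise", "Disgust"]
FPS = 30          # readout granularity: one FMEE label per 1/30th-second frame
N_POINTS = 68     # assumed landmark count for the 3-D face model

rng = np.random.default_rng(0)

# Hypothetical per-emotion landmark templates and the subject's neutral baseline.
templates = {e: rng.normal(size=(N_POINTS, 3)) for e in EMOTIONS}
baseline = rng.normal(size=(N_POINTS, 3))

def classify_frame(landmarks: np.ndarray) -> str:
    """Label one video frame with the nearest emotion template,
    measuring displacement relative to the neutral baseline."""
    displacement = landmarks - baseline
    scores = {
        e: float(np.linalg.norm(displacement - (t - baseline)))
        for e, t in templates.items()
    }
    best = min(scores, key=scores.get)
    # If the frame is closer to the neutral baseline than to any
    # emotion template, report it as Neutral.
    return best if scores[best] < np.linalg.norm(displacement) else "Neutral"

# One second of synthetic video: 30 frames of landmark data.
video = rng.normal(size=(FPS, N_POINTS, 3))
readout = [classify_frame(frame) for frame in video]
print(readout)  # one FMEE label per 1/30th-second frame
```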

From the readout data, Face2Face produces an "emotional footprint" of the subject's feelings on any given topic: a data set and graphic representation of how their emotions flow and interact with one another, and a record of which topics or experiences activate or "trigger" those emotions.
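Again purely as an illustration, a footprint of this kind could be assembled by aggregating per-frame labels (like those from the sketch above) by topic; the emotional_footprint helper and its 50% trigger threshold are assumptions, not documented behavior.

```python
# Hypothetical aggregation of a per-frame FMEE readout into an
# "emotional footprint": per-topic emotion frequencies plus a trigger list.
from collections import Counter

def emotional_footprint(segments):
    """segments: list of (topic, [frame labels]) pairs from the readout."""
    footprint = {}
    triggers = []
    for topic, labels in segments:
        counts = Counter(labels)
        total = sum(counts.values())
        footprint[topic] = {e: n / total for e, n in counts.items()}
        # Flag a topic as a "trigger" if non-neutral frames dominate
        # (the 0.5 threshold is an invented cutoff).
        if 1 - counts.get("Neutral", 0) / total > 0.5:
            triggers.append(topic)
    return footprint, triggers

interview = [
    ("family", ["Happiness", "Neutral", "Neutral", "Happiness"]),
    ("deployment", ["Fear", "Anger", "Fear", "Neutral"]),
]
footprint, triggers = emotional_footprint(interview)
print(footprint)  # per-topic emotion frequencies
print(triggers)   # ['deployment'] -- mostly non-neutral frames
```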

Face Data Points
Fig. 3

Face2Face goes far beyond the mere classification of emotions from social media photos. Feelings aren't static; they're dynamic and fluid, reflecting both our personalities and how we've learned to respond to different situations.5 Face2Face reflects that reality in the way it treats FMEE data: it produces an overview of the individual's overall emotional status (resilience), their emotional valences toward certain situations (reactivity and/or bias), and the emotional reactions and defenses they unconsciously mobilize when they experience stress (coping styles/personality traits).

Face2Face maps the ever-changing human emotional landscape on the basis of hard data. Its assessments of an individual's resilience, reactivity, and coping style are driven by comparing their responses to different topics against their own baseline (neutral) reactions, and by comparing their reactions to stimulus pictures with nationally based norms. Face2Face's integrated neural-network machine learning is designed to continuously improve its accuracy in those areas.
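The two comparisons could look something like the sketch below; the emotion-frequency vectors, norm statistics, and function names are all made up for illustration and are not Future Life's published formulas.

```python
# Sketch of the two comparisons described above, with invented values:
# (1) topic responses vs. the subject's own neutral baseline, and
# (2) stimulus-picture reactions vs. nationally based norms (as z-scores).
import numpy as np

def reactivity(topic_response: np.ndarray, personal_baseline: np.ndarray) -> float:
    """Mean absolute deviation of a topic's emotion frequencies from baseline."""
    return float(np.mean(np.abs(topic_response - personal_baseline)))

def norm_z_score(reaction: float, norm_mean: float, norm_sd: float) -> float:
    """How far a stimulus-picture reaction sits from the national norm."""
    return (reaction - norm_mean) / norm_sd

# Example with made-up numbers: six emotion frequencies per condition.
topic = np.array([0.40, 0.10, 0.25, 0.05, 0.10, 0.10])  # responses to one topic
base  = np.array([0.15, 0.15, 0.15, 0.25, 0.15, 0.15])  # subject's neutral baseline
print(reactivity(topic, base))         # higher values suggest greater reactivity
print(norm_z_score(0.42, 0.30, 0.08))  # reaction 1.5 SDs above the national norm
```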

Moreover, its machine learning is not limited to self-correction. It is designed to compile an ever-expanding database of emotional interactions and behaviors it can detect or predict. As its store of data grows, it will be able to create algorithms for almost any purpose.