Future Life, Inc.

Micro-Expression Graphing


Your Emotional Footprint

The technology pioneered by Future Life maps the Human Emotional Footprint using Facial Micro-Expression of Emotion (FMEE) analysis to objectively measure emotional responses and actions that are usually undetectable to the human eye. We call this technology Face2Face, and our initial focus is on post-traumatic stress disorder and on credibility assessment with deception detection. Our findings demonstrate that Face2Face improves the accuracy of identifying mental health conditions over time by more than fifty percent compared with the rates currently reported by government clinicians.

By “Emotional Footprint” we mean an overview of a person’s predominant emotions at that point in their life. The Face2Face Emotional Footprint provides a window into an individual’s feelings on any given topic, a data set and graphic representation of how their emotions flow and interact with one another, and a record of which topics or experiences activate or “trigger” their emotions. The Emotional Footprint makes it possible to break out the data and examine how an individual responds to a particular question or set of questions in, say, a recruitment or job interview, a credibility assessment, or a combat mission debriefing, or to a particular stimulus picture or class of images.
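To make the idea concrete, the footprint can be pictured as a per-topic record of emotion scores and triggers. The sketch below is purely illustrative: the field names and structure are our assumptions, not the Face2Face schema.

    from dataclasses import dataclass, field

    # Hypothetical illustration of an Emotional Footprint record; the field
    # names here are ours and do not reflect Face2Face's internal schema.
    @dataclass
    class FootprintEntry:
        topic: str                        # question, stimulus image, or experience
        emotion_scores: dict[str, float]  # e.g. {"fear": 0.72, "joy": 0.05, ...}
        triggered: bool                   # did this topic activate a strong response?

    @dataclass
    class EmotionalFootprint:
        subject_id: str
        entries: list[FootprintEntry] = field(default_factory=list)

        def triggers(self) -> list[str]:
            """Topics or experiences that activated ("triggered") the subject."""
            return [e.topic for e in self.entries if e.triggered]

A structure like this supports exactly the break-outs described above: filtering entries by interview question, stimulus class, or debriefing topic.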


Telehealth Technology

We show our emotions in many ways, such as body language and vocal tone, but the most compelling way is through our facial expressions. With strangers, our predictions of their thoughts and feelings are accurate roughly 20% of the time; with close intimates, like husbands and wives, the rate rises to about 35%. Even the best among us at reading each other’s thoughts and feelings rarely reach 60% or better.1 Our deepest thoughts and feelings are easily kept secret from one another, and often from ourselves.2

There is a constant stream of involuntary non-verbal communication flowing between us on our faces. The expressions of emotion we see on one another’s faces in typical interactions last between half a second and four seconds. These are classified as macro-expressions, and we know people can control them. For example, we often smile when we are supposed to, not because we actually feel like smiling.
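For orientation, expressions can be bucketed by duration alone. The sketch below uses the half-second to four-second macro-expression window described above and, as an assumption on our part, treats anything briefer as a candidate micro-expression.

    # Illustrative duration-based bucketing, using the 0.5-4 s macro-expression
    # window described above; anything briefer is treated as a candidate
    # micro-expression. Thresholds are assumptions, not product constants.
    MACRO_MIN_S = 0.5
    MACRO_MAX_S = 4.0

    def classify_expression(duration_s: float) -> str:
        if duration_s < MACRO_MIN_S:
            return "micro"       # too brief for conscious control
        if duration_s <= MACRO_MAX_S:
            return "macro"       # visible and often deliberately controllable
        return "prolonged"       # held expressions, e.g. a posed smile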

Face2Face uses state-of-the-art machine learning to build an emotional footprint of a subject. Our technology displays a real-time analysis of a subject’s emotional state across seven key emotions. It then combines these emotions into a correlation matrix that allows a clinician to assess underlying conditions that may not be outwardly apparent (a sketch of this correlation step follows the list below). Building on a solid initial product, the 3.0 product roadmap includes numerous features to greatly enhance its capabilities. These include:

  • The addition of more sensors (beyond video) to obtain a more well-rounded picture of a subject.
  • More powerful reporting, graphing and analysis tools will allow a clinician to gain greater insight into a subject.
  • An increased focus on tele-operation of the product will allow the subject and clinician to be in different locations if the conditions dictate.
  • A cloud / SaaS model ensures the product being used is always up to date: no software to install and no updates to worry about.
  • A modern, responsive interface will allow the product to operate on a host of devices, from laptops to tablets to phones.
  • Authenticated sessions will allow a clinician to pause and resume sessions, maintain patient notes and refer to prior saved sessions to aid in assessments.
  • Secure, open APIs will allow third party applications to leverage this platform as well as provide FutureLife the ability to rapidly build on top of the platform.
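
The correlation step referenced above can be pictured with a short sketch. This is a minimal illustration assuming per-frame scores for seven emotions (Ekman’s basic set is a common choice); the emotion labels, the NumPy-based pipeline, and the function name are our assumptions, not a description of Face2Face internals.

    import numpy as np

    # Hypothetical per-frame emotion scores: rows = 7 emotions, columns = frames.
    # The seven labels below are Ekman's basic emotions, a common choice; the
    # product's actual label set and scoring pipeline are not public.
    EMOTIONS = ["anger", "contempt", "disgust", "fear",
                "happiness", "sadness", "surprise"]

    def emotion_correlation_matrix(scores: np.ndarray) -> np.ndarray:
        """Pearson correlation between emotion time series.

        scores: array of shape (7, n_frames), one row per emotion.
        Returns a 7x7 matrix; entry (i, j) shows how emotion i co-varies
        with emotion j across the session.
        """
        assert scores.shape[0] == len(EMOTIONS)
        return np.corrcoef(scores)

    # Example: 300 frames (~10 s at 30 fps) of random placeholder scores.
    rng = np.random.default_rng(0)
    matrix = emotion_correlation_matrix(rng.random((7, 300)))
    print(np.round(matrix, 2))

A clinician reading such a matrix looks for pairings, for example fear scores rising together with sadness, that may point to conditions not outwardly apparent in any single expression.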

The data from this readout yields the subject’s Emotional Footprint described above: their feelings on any given topic, how their emotions flow and interact with one another, and which topics or experiences activate or “trigger” them.

Fig. 3: Face data points

Face2Face maps the ever-changing human emotional landscape on the basis of hard data, and has proven accurate in 999 of 1,000 cases. Its assessments of an individual’s resilience, reactivity, and coping style are driven by comparing their responses to different topics against their own baseline (neutral) reactions, and by comparing their reactions to stimulus pictures against national norms. Face2Face’s integrated neural-network machine learning is designed to continuously improve its accuracy in these areas.
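One way to picture the baseline comparison is as a standardized deviation: a topic response is scored against the subject’s own neutral reactions, measured in standard deviations. The sketch below is illustrative only; the function name, the sample numbers, and the use of a z-score are assumptions on our part, not Face2Face’s published method.

    import numpy as np

    def reactivity_z(response: np.ndarray,
                     baseline_mean: float,
                     baseline_std: float) -> float:
        """How far a topic response departs from the subject's own neutral
        baseline, in standard deviations. Illustrative only; the actual
        Face2Face scoring method is not public."""
        return float((response.mean() - baseline_mean) / baseline_std)

    # Example: fear scores while viewing a stimulus image vs. a neutral baseline.
    stimulus_fear = np.array([0.41, 0.55, 0.62, 0.48])
    z = reactivity_z(stimulus_fear, baseline_mean=0.12, baseline_std=0.08)
    print(f"{z:.1f} SD above this subject's neutral baseline")
    # The same score could then be compared against national norms, e.g. a
    # norm table keyed by stimulus class (an assumption on our part).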

Moreover, its machine learning is not limited to self-correction. It is designed to compile an ever-expanding database of the emotional interactions and behaviors it can detect or predict. As its store of data grows, it will be able to support algorithms for an expanding range of purposes.

Click here for the Future Life White Paper on Face2Face.

Click here for a two-page summary of a rigorous and reliable scientific study of Face2Face’s deception detection capabilities, focusing on fear.


National Suicide Prevention Lifeline: 988