Affdex Facial Coding
The challenge marketers face with consumer response...
Marketers recognize that emotion drives brand loyalty and purchase decisions. Yet traditional ways of measuring emotional response, such as surveys and focus groups, create a gap by requiring viewers to reflect on and articulate how they feel. Neuroscience offers insight into how the mind works, but it typically requires expensive, bulky equipment and lab-type settings that constrain and influence the viewing experience.
Delivering true emotion insight
Affdex is an award-winning neuromarketing tool that reads emotional states such as liking and attention from facial expressions using an ordinary webcam, giving marketers faster, more accurate insight into consumer response to brands, advertising and media. It uses automated facial expression recognition, also called facial coding, to analyze your face and interpret your emotional state. Offered as cloud-based software-as-a-service, Affdex is fast, easy and affordable to add to existing studies. MIT spinoff Affectiva has some of the best and brightest emotion experts behind the Affdex platform science, providing the most accurate measurement available today. This ongoing investment in research and development is focused not just on measuring but on predicting which ads will really work to drive sales and build brands.
History of Facial Coding: The Foundation of Affdex
The research behind facial coding dates back to early studies by Darwin, which concluded that expressions are universal and are even shared with animals. Paul Ekman (1972) confirmed the universality of six core expressions and popularized the Facial Action Coding System (FACS), which is used to consistently describe facial expressions and movements. This system is widely used and has even been featured in the popular TV show Lie to Me.
Powering Emotional Insight
At the core of the Affdex science are patented, state-of-the-art emotion algorithms that have been put to the test in over a thousand studies around the globe. These machine-learning, computer-vision algorithms take face videos as input and produce frame-by-frame emotion metrics as output.
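The face-video-in, metrics-out pipeline described above can be sketched in a few lines of Python. This is purely illustrative: `detect_face` and `score_expressions` are stubs standing in for Affectiva's proprietary, trained models, and the expression names and values are invented.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Frame:
    """One video frame; a real system would hold pixel data here."""
    index: int
    pixels: bytes

def detect_face(frame: Frame) -> bool:
    """Stub face detector. A production system would run a trained
    computer-vision model; here we pretend every frame has a face."""
    return True

def score_expressions(frame: Frame) -> Dict[str, float]:
    """Stub expression classifier returning per-expression scores.
    The real classifiers are trained machine-learning models."""
    return {"smile": 0.8, "brow_raise": 0.1}

def analyze(video: List[Frame]) -> List[Dict[str, float]]:
    """Frame-by-frame analysis: one dict of emotion metrics per frame."""
    metrics = []
    for frame in video:
        if detect_face(frame):
            metrics.append(score_expressions(frame))
        else:
            metrics.append({})  # no face found in this frame
    return metrics

video = [Frame(i, b"") for i in range(3)]
print(analyze(video))
```

The key property this sketch preserves is the frame-by-frame output: one metrics record per input frame, so emotional response can be plotted as a time series against the ad being viewed.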
The three pillars of our science:

Robust, Global, Real-World Relevant
- Beyond the Basic 6 Emotions: Gather the most relevant emotional expressions for ad and video testing.
- Multi-Modal Measures: Include head gestures, eye tracking and other physiological responses in the emotion algorithms.
- Discrete Emotional Measures and Continuous Dimensions of Emotion: Understand both to get a more complete picture of an emotional state.

Precision, Recall, Global Validation, Strong IP
- Globally Validated: Cross-cultural adaptability enables the system to be used throughout the world.
- Posed vs. Spontaneous, "Real-World" Data: In real-world environments, emotions occur naturally and are not posed.
- Robust Algorithms: Affectiva's emotion algorithms leverage an appearance-based model.

Scalable, Unobtrusive, Cost-Effective
- Amazon Cloud Computing: Enables cost-effective, quick processing of large datasets, with both security and scale.
- Internet and Offline Options: Captures emotional response anywhere, whether reliable internet is available or not.
- Continuous Machine Learning: Creates better, more accurate measures, in combination with our vast face repository.
Emotion Measures Beyond the Basic 6 Emotions
Affdex delivers both discrete and dimensional emotion metrics.
Affdex leverages our extensive facial video repository to guide and prioritize development of new measures, selecting those that are the most relevant in real life.
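One way to see how discrete and dimensional metrics relate: discrete expression scores can be collapsed into a continuous dimension such as valence (negative-to-positive feeling). The sketch below is only an illustration of that idea; the expression names and weights are invented and are not Affdex's actual model.

```python
# Hypothetical expression-to-valence weights (not Affdex's real mapping).
POSITIVE = {"smile": 1.0}
NEGATIVE = {"brow_furrow": 1.0, "lip_press": 0.5}

def valence(discrete: dict) -> float:
    """Collapse discrete expression scores (each in 0..1) into one
    continuous valence dimension clamped to [-1, 1]."""
    pos = sum(w * discrete.get(k, 0.0) for k, w in POSITIVE.items())
    neg = sum(w * discrete.get(k, 0.0) for k, w in NEGATIVE.items())
    return max(-1.0, min(1.0, pos - neg))

print(valence({"smile": 0.9, "brow_furrow": 0.1}))  # mostly positive moment
```

Reporting both levels is useful because the discrete scores say *which* expressions occurred while the dimensional score summarizes *how* the viewer felt overall at that moment.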
Rigorous Science for Real World Use – Accuracy Global Validation
To produce highly accurate emotion metrics, Affdex classifiers are trained on both posed and spontaneous, naturally occurring data. We also test our classifiers on both posed and spontaneous data, and we insist that they perform at 80-90% or better before reaching production use.
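Performance bars like the 80-90% figure above are typically checked on held-out test data using metrics such as precision and recall (the pillar keywords earlier name both). A minimal sketch of that evaluation, with fabricated labels for a single binary expression classifier:

```python
def precision_recall(y_true, y_pred):
    """Precision and recall for a binary classifier
    (1 = expression present, 0 = absent)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# Toy held-out labels; real validation uses large labeled datasets
# of both posed and spontaneous expressions.
truth = [1, 1, 1, 0, 0, 1, 0, 1]
preds = [1, 1, 0, 0, 1, 1, 0, 1]
p, r = precision_recall(truth, preds)
print(p, r)  # 0.8 0.8 — this toy classifier would sit at the 80% bar
```

Testing on spontaneous as well as posed data matters because a classifier can score well on exaggerated, posed expressions yet miss the subtler versions that occur naturally.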
Validation studies have also been completed in Asia, Latin America, the US and the UK to confirm that key facial expressions are universal. These studies also found that while the expressions themselves were universal, their magnitude varied greatly across markets. This underscores the need for market-specific normative data, which is part of every Affdex study.
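One common way to apply market-specific norms is to score each raw expression intensity relative to its market's baseline, for example as a z-score. The sketch below illustrates the idea only; the markets, baseline values and the choice of z-scoring are assumptions, not Affdex's published method.

```python
import statistics

# Fabricated smile-intensity baselines per market; real norms would
# come from a large normative database.
MARKET_NORMS = {
    "US": [0.60, 0.70, 0.65, 0.75],
    "Japan": [0.20, 0.30, 0.25, 0.35],
}

def normalized_score(market: str, raw: float) -> float:
    """Z-score a raw expression intensity against its market's norm,
    so responses are compared to local baselines rather than to a
    single global scale."""
    norm = MARKET_NORMS[market]
    mean = statistics.mean(norm)
    sd = statistics.stdev(norm)
    return (raw - mean) / sd

# A restrained smile in one market can rank as strongly as an
# exuberant one in another once each is scored against its own norm.
print(normalized_score("US", 0.80))
print(normalized_score("Japan", 0.40))
```

In this toy data the two raw intensities differ by a factor of two, yet both sit the same distance above their own market's baseline, which is exactly the comparison market-specific norms are meant to enable.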
Continuous Improvement: A Platform for Classifier Training and Validation
- A framework for labeling and training new Action Units (AUs) to improve accuracy, making classifier creation and testing repeatable and scalable
- The largest repository of spontaneous facial expressions, with over 50,000,000 real-world face frames
- Significant investment in science infrastructure, including a dedicated labeling team and automation for classifier creation and testing
News Article | Posted on 12/23/2013
2013 Review: The biggest success stories
The reviews of 2013 so far have had some common themes, with big data and mobile featuring heavily. These topics also cropped up in contributors’ views on the biggest success stories of the year, but there was plenty of variation besides. Another technology-related shout-out went to Affectiva. “Affectiva’s facial coding technology has got to be one of the biggest winners this year. Adopted by Millward Brown as a standard feature in their global link tests and presenting at every conference going.” Jon Puleston, GMI
News Article | Posted on 12/10/2013
Facial Coding Technology Helps Marketers Optimize Ad Effectiveness
Facial expressions can reveal what people do not say. That is the premise behind the Millward Brown Link with Affdex facial coding technology, which gauges viewers' true reactions to pre-launch advertisements.
News Article | Posted on 10/28/2013
Startup Gets Computers to Read Faces, Seeks Purpose Beyond Ads
A technology for reading emotions on faces can help companies sell candy bars. Now its creators want to see whether it can take on bigger problems in areas including education.
News Article | Posted on 5/24/2013
One Day Your Phone Will Know If You’re Happy or Sad
But what if these devices could really read our emotions? What if they could interpret every little gesture, every facial cue, so that they could gauge our feelings as well as, maybe better than, our best friends? And then they respond, not with information, but with what might pass for empathy.
News Article | Posted on 4/8/2013
Affectiva presenting at Esomar Congress 2013!
Ads that evoke emotions are more entertaining and memorable, but do they really drive product sales? "Using Facial Coding to Understand the Relation between Emotional Ads and Sales Effectiveness."
In a recent study drawing on Affectiva's data repository, the world's largest, emotions in ads are tied to real-world sales effectiveness. Using facial coding, the researchers crowdsourced emotional responses from more than 1,000 viewers to over 140 ads in 4 countries and across 4 product categories, distilling the sales data associated with each ad. They then built and validated models that identify the facial measures and emotion trajectories most predictive of sales performance.
News Article | Posted on 2/20/2013
For the first time, brands will be able to understand and react in real-time to users’ emotional responses to their online advertising
Affdex uses computer vision and machine learning algorithms, developed in partnership with MIT, to detect facial expressions and head gestures captured by webcams or mobile cameras. It assesses, analyzes and interprets the user's reactions to content, detecting the full range of emotions from joy, discomfort and indifference to rapt engagement.
News Article | Posted on 2/19/2013
Affectiva Inks Deal With Ebuzzing Social To Integrate Face Tracking & Emotional Response Into Online Video Ad Analytics
The deal will mean that companies like Heineken and Red Bull will be able to track in real time how users are responding to their advertisements in the wild. That will not only mean taking away the need to run special market research sessions on limited groups of users, but potentially introduces new ways of measuring how effective a video has been, moving away from more traditional online ad metrics like page views and dwell time.
News Article | Posted on 2/2/2013
Building better Super Bowl ads by watching you watch them
A 3-year-old company takes technology from MIT's Media Lab and applies it to ad testing. Welcome to the future of advertising, where the wisdom of spending a reported $4 million for a 30-second spot in the Super Bowl doesn't have to be left to the imagination of an ad agency's creative team and the honesty of focus groups.
News Article | Posted on 1/29/2013
Creating the Perfect Super Bowl Ad
Are advertisers striking the right balance between entertaining and promoting their brands? Could companies entertain less and get consumers to buy?
News Article | Posted on 1/18/2013
Let the NeuroGames Begin
Neurogaming is set to disrupt the gaming industry and transform the gaming experience over the next five years. Companies like Affectiva are using a variety of techniques, such as analyzing facial expressions, to understand emotional reactions within the game design process.