Eye tracking and facial decoding – how does it work?
An important stage in the Maker Studio process is the online testing of content proposals with biometric technologies (eye tracking and facial decoding), in which a representative panel of GenZ participates.
But how do these technologies work exactly?
Let’s first talk about eye tracking.
Spoiler alert: we go into some technical details. Infrared light from the cameras is directed at the participants' eyes, producing reflections on the pupil and the cornea. This technique, known as pupil centre corneal reflection (PCCR), uses the position of these reflections to infer eye movement and gaze direction.
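To make the PCCR idea concrete, here is a toy sketch: as the eye rotates, the pupil centre shifts relative to the corneal reflection (the "glint"), and that offset can be mapped to a point on the screen. The function names, calibration gain and screen size below are illustrative assumptions, not part of any real eye-tracking product.

```python
# Toy PCCR sketch: the vector from the corneal glint to the pupil centre
# shifts as the eye rotates, while the glint stays roughly fixed.
# All calibration values here are made up for illustration.

def pccr_vector(pupil_centre, glint):
    """Vector from corneal glint to pupil centre, in image pixels."""
    return (pupil_centre[0] - glint[0], pupil_centre[1] - glint[1])

def gaze_point(pupil_centre, glint, gain=(25.0, 25.0), screen_centre=(960, 540)):
    """Map the PCCR vector to screen coordinates with a linear calibration.

    Real systems fit `gain` (and higher-order terms) per participant
    during a calibration routine; here it is an assumed constant.
    """
    vx, vy = pccr_vector(pupil_centre, glint)
    return (screen_centre[0] + gain[0] * vx, screen_centre[1] + gain[1] * vy)

# Eye looking straight ahead: pupil centre coincides with the glint,
# so the estimated gaze lands at the screen centre.
print(gaze_point((320, 240), (320, 240)))   # -> (960.0, 540.0)
# Pupil shifted 4 px right of the glint: gaze lands right of centre.
print(gaze_point((324, 240), (320, 240)))   # -> (1060.0, 540.0)
```

Real trackers fit a per-participant calibration instead of a fixed gain, but the underlying signal is this glint-to-pupil vector.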
In academia, researchers use information about eye movements and gaze focus to assess attentional processes. The aim is to compare the behaviour of a group, measure stimulus-induced visual responses and more.
We use eye tracking to better understand the customer experience and the performance of proposed ideas by measuring visual attention: viewing peaks in a video, placement and branding, package design and more.
iMotions Software
The iMotions Software allows us to use any type of stimulus, whether we are analysing images, videos, augmented reality & VR or any other type of content. It supports 100% remote test set-up and data collection, so panel members can participate from anywhere.
The most useful functionality of eye tracking is the heatmap. Heatmaps are visualisations that show the overall distribution of gaze fixation points on a video or static material, usually displayed as a colour gradient superimposed over the image or stimulus presented. Red, yellow and green represent, in descending order, the number of gaze points directed at each part of the material being analysed.
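The colour gradient described above boils down to counting gaze points per region and ranking regions by that count. Here is a minimal sketch of that idea, assuming gaze points arrive as pixel coordinates; the grid size, thresholds and sample points are invented for illustration.

```python
# Minimal heatmap sketch: bin gaze points into a coarse grid and map
# each cell's share of the peak count to the red/yellow/green gradient.
# Cell size, thresholds and the sample points are illustrative.
from collections import Counter

def heatmap_counts(gaze_points, cell=100):
    """Bin gaze points (x, y in pixels) into cells of `cell` x `cell` px."""
    return Counter((x // cell, y // cell) for x, y in gaze_points)

def colour_band(count, max_count):
    """Map a cell's share of the peak gaze count to a gradient colour."""
    share = count / max_count
    if share > 0.7:
        return "red"      # hottest: most gaze points
    if share > 0.35:
        return "yellow"
    return "green"        # coolest of the coloured cells

points = [(120, 80), (130, 90), (125, 85), (410, 300), (415, 305), (700, 500)]
counts = heatmap_counts(points)
peak = max(counts.values())
for cell_xy, n in sorted(counts.items()):
    print(cell_xy, n, colour_band(n, peak))
# -> (1, 0) 3 red
#    (4, 3) 2 yellow
#    (7, 5) 1 green
```

A production tool would smooth the counts (for example with a Gaussian kernel) and blend the colours over the stimulus image, but the ranking logic is the same.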

Facial Decoding
Let’s see how facial decoding works. One of the most powerful indicators of emotion is our face. When we laugh or cry, we bring emotions to the surface, allowing others to peer into our minds by “reading” our face based on changes in key facial features. Facial coding is the process by which we measure human emotions through facial expressions. With facial expression analysis we can test the impact of any content, product or service that aims to elicit emotion and facial responses. The analysis uses 20 expression measures and 7 main emotions (joy, anger, fear, disgust, contempt, sadness and surprise).
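To illustrate the kind of output facial coding produces, here is a sketch of how per-frame emotion scores might be summarised for a clip. The seven emotion labels come from the article; the frame scores and function names are made-up illustrations, not real classifier output.

```python
# Illustrative sketch of summarising per-frame facial-coding output.
# The seven emotions are from the article; the scores are invented.

EMOTIONS = ["joy", "anger", "fear", "disgust", "contempt", "sadness", "surprise"]

def dominant_emotion(frame_scores):
    """Return the emotion with the highest score in one video frame."""
    return max(frame_scores, key=frame_scores.get)

def summarise(frames):
    """Average each emotion's score across all frames of a clip."""
    return {e: sum(f[e] for f in frames) / len(frames) for e in EMOTIONS}

frames = [
    {"joy": 0.7, "anger": 0.0, "fear": 0.0, "disgust": 0.0,
     "contempt": 0.1, "sadness": 0.0, "surprise": 0.2},
    {"joy": 0.5, "anger": 0.0, "fear": 0.1, "disgust": 0.0,
     "contempt": 0.0, "sadness": 0.0, "surprise": 0.4},
]

print(dominant_emotion(frames[0]))          # -> joy
print(round(summarise(frames)["joy"], 2))   # -> 0.6
```

In practice the per-frame scores come from a trained expression classifier; the summary step is what turns them into the second-by-second emotional profile of a video.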
Specifically, we can analyse any type of content and come up with suggestions to improve the material.
For example, this is what a heatmap eye tracking test looks like on one of Maker Studio’s projects, Philadelphia Pancakes, with Andreea Corb:
We used heatmaps to visualise which elements attract the most attention and which have a high emotional impact. Comparing the reactions of individual respondents, we found the following:
- The video generated strong appeal for the product, highlighted the brand and attracted the attention of test participants;
- Although tested alongside other, shorter videos (including cat videos), it was the most engaging in the test despite its length;
- The most engaging moment of the video was the “I stole some cream to make a sandwich” scene, which demonstrated the versatility of the product in a natural, unscripted context.

That was a short taste. At Maker Studio, in addition to biometric testing of short video content, we can also provide insights on:
- Selecting the best content creator for your brand based on analysis of the content they generate in their communities;
- Analysing your own social media content to understand how engaging it is.