BuzzIndex

Mar 25, 2016

Ebuzzing

Introducing BuzzIndex, Ebuzzing Social's unique emotion-sensing and performance-based technology
Transcript
Page 1: BuzzIndex

For the first time ever, brands will be able to understand and react in real time to users' emotional responses to their online advertising. Having integrated facial coding software into our platform, we can scientifically measure users' emotions as they view video ads, enabling us to identify and predict which content will generate the best emotional responses.

The technology, which has a database of over 300 million face frames, uses Computer Vision and Machine Learning algorithms developed in partnership with MIT to detect facial expressions and head gestures obtained from webcams or mobile cameras. It assesses, analyses and interprets the user's reactions to content to detect the full range of emotions, from joy, discomfort and indifference to rapt engagement.
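As an illustration of the kind of pipeline described above (a minimal sketch, not Ebuzzing's actual implementation), the snippet below detects a face in each webcam frame with OpenCV and hands the crop to a placeholder emotion classifier; `classify_emotion` is a hypothetical stand-in for the facial coding model.

```python
# Hypothetical sketch of a facial-coding loop: OpenCV face detection on webcam
# frames, with a placeholder classifier standing in for the real emotion model.
import cv2

# Standard Haar cascade shipped with OpenCV for frontal-face detection.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def classify_emotion(face_crop):
    """Placeholder for the facial coding model (joy, discomfort, indifference,
    engagement). A real system would run a trained CV/ML classifier here."""
    return {"joy": 0.0, "discomfort": 0.0, "indifference": 0.0, "engagement": 0.0}

capture = cv2.VideoCapture(0)          # default webcam
emotion_timeline = []                  # one emotion reading per sampled frame

while len(emotion_timeline) < 100:     # sample ~100 frames for the demo
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        emotion_timeline.append(classify_emotion(gray[y:y + h, x:x + w]))

capture.release()
```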

MAPPING OF DATA

By using this emotional data and overlaying Ebuzzing Social's video performance data, we have deep insight into the statistical correlations between human emotions and video interactions, such as shareability. This means that we are able to predict the success of your video based on both performance and emotion benchmarks.
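To make the overlay concrete, here is a hedged sketch of the kind of analysis described: given per-video average emotion scores and observed share rates (illustrative values and column names, not Ebuzzing Social's schema or model), it computes the correlation and fits a simple linear benchmark.

```python
# Illustrative sketch only: correlate per-video emotion scores with share rates
# and fit a simple linear model as a stand-in for the performance benchmark.
import numpy as np

# Hypothetical campaign data: mean "joy" score per video and its share rate.
joy_score = np.array([0.22, 0.41, 0.35, 0.58, 0.67, 0.30])
share_rate = np.array([0.010, 0.018, 0.015, 0.026, 0.031, 0.012])

# Pearson correlation between the emotional signal and shareability.
correlation = np.corrcoef(joy_score, share_rate)[0, 1]

# Least-squares fit: predicted share rate as a linear function of joy score.
slope, intercept = np.polyfit(joy_score, share_rate, deg=1)

new_video_joy = 0.5
predicted_share_rate = slope * new_video_joy + intercept
print(f"correlation={correlation:.2f}, predicted share rate={predicted_share_rate:.3f}")
```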

WHY USE THIS TECHNOLOGY?

• Identify the emotional reaction to your video content either before, during or after the creative process.

• Predict the performance & shareability of your video based on our model.

• Benchmark the performance of your video campaign versus your competitors and industry and vertical norms.

BUZZINDEX POWERED BY EBUZZING SOCIAL

WWW.EBUZZING.COM
