This is part of a series on machine intelligence companies. We interviewed Beagle, Mariana, Beyond Verbal, Preteckt, and Eigen Innovations. Now we’re featuring Affectiva, which analyzes human emotions.

Machines don’t know what it’s like to have feelings. Yet for most of us, emotions are the prism through which we view our lives.

What seems like a benign request when we’re in a good mood can seem like nagging when we’re stressed. We tolerate jokes from our loved ones that we’d never suffer from someone we can’t stand.

Emotions are core to the so-called human experience.

Affectiva, an emotional intelligence company spun out of the illustrious MIT Media Lab, is teaching artificial intelligence systems to view the world through that same prism.

In the process, it’s making it easier than ever for developers to build their products around a deep understanding of how people feel at any given moment, thanks to a series of public SDKs.

[Image: Gabi Zijderveld]

We interviewed Gabi Zijderveld, Affectiva’s Vice President of Marketing and Product Strategy, to learn more about how the company plans to teach machines to understand our wily emotions. 

Jay Turcot, Affectiva’s Director of Applied Science, also provided insight into the company’s science and use of machine learning.

[Editor’s note: The interview questions and responses below have been edited for clarity and length.]

What’s your role at Affectiva and what does that entail?

My role is VP of Marketing and Product Strategy. 

At the highest level, what that entails is all the marketing functions — including PR, media relations, communications, events, messaging, positioning, digital marketing campaigns, website, and content development. 

I’m also involved in product strategy, so basically I lead the product management function.

We’re a startup with a relatively small team, so many of the people who work here wear a variety of hats.

[Image: The Affectiva team]

What did you do before you joined Affectiva?

I’ve worked in high tech for over 20 years in a variety of roles encompassing product management, marketing, and channel management.

I’ve done a lot to help US-based software companies grow their customer base outside the US. I’ve worked predominantly at software companies, but I’ve also been at IBM, where I worked on the hardware side.

I’ve been around the block in high tech for quite some time. That diversity of experience is especially important when you’re at a startup and you’re a small team, because effectively what we did a few years ago was set out to build a completely new company.

We’re building a new market, defining a new product category, and building new technology. And my personal role ranges from the extremely strategic (What is the vision for the future? How do you define emotion AI?) to the inanely tactical.

Sometimes my workday can be ridiculously tactical. Right now I don’t have an intern who’s doing social media for me. So if I want a tweet to go out, I’m the one tweeting it.

I have a lot of responsibility. I’m basically shaping the future but also getting my hands very dirty every single day, which can be tough, but which I actually really, really like.

What’s the most difficult thing about your role at Affectiva?

The most difficult thing — and I don’t mean it in a clichéd way — is balancing the workload.

There are so many moving parts and so many requests; I’m permanently backlogged on email. We’re a very small team and there’s basically too much going on for us to handle.

It’s a mix of prioritizing and time management, while knowing you never have enough time.

That’s a good problem to have, and over time it will get rectified. But right now that’s a daily challenge.

[Image: The Affectiva team at a hackathon]

What is “emotional intelligence”? How is it quantified and analyzed?

In a nutshell, our software measures facial expressions of emotion.

In order to do that, all we need is our software running either in the cloud or on the device, plus access to a webcam or device camera. We’re all about opt-in and consent, of course; we want people to know we’re recording their faces and to say they’re okay with that.

Our algorithms identify the face and pinpoint key landmarks on it — the tip of your nose, the corners of your eyebrows, your chin — and then analyze pixel-level gradations to measure facial muscle movement.

From there our software measures different facial expressions, which are then mapped to emotions. Our algorithms are trained and tested using our extensive emotion database.
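To make that concrete, here is a minimal sketch of the general pipeline Gabi describes: detect the face, locate landmarks, score expressions, map them to emotions. Every function below is a stubbed placeholder of our own invention, not Affectiva’s actual code or API; the stubs stand in for trained models at each stage.

```python
# A minimal sketch of a landmark-based emotion pipeline.
# All names here are hypothetical placeholders, not Affectiva's API.

import random

def detect_face(frame):
    """Locate the face in a video frame; return a bounding box."""
    return (40, 60, 160, 200)  # placeholder: x, y, width, height

def locate_landmarks(frame, face_box):
    """Pinpoint key facial landmarks: nose tip, eyebrow corners, chin."""
    return {"nose_tip": (120, 150), "chin": (118, 240),
            "left_brow": (80, 90), "right_brow": (160, 92)}

def measure_expressions(frame, landmarks):
    """Score facial muscle movements from pixel-level gradations
    around the landmarks (e.g., smile, brow raise)."""
    return {"smile": random.random(), "brow_raise": random.random()}

def map_to_emotions(expressions):
    """Map expression scores to emotion estimates, as would be learned
    from a large labeled database of faces."""
    return {"joy": expressions["smile"],
            "surprise": expressions["brow_raise"]}

for frame in range(3):  # stand-in for a frame-by-frame camera stream
    face = detect_face(frame)
    landmarks = locate_landmarks(frame, face)
    expressions = measure_expressions(frame, landmarks)
    print(map_to_emotions(expressions))
```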

[Image: Dan McDuff, AffdexMe]

Our database has information from over 4 million faces from 75 countries, all of which have been measured and analyzed frame-by-frame, second-by-second.

We have over 50 billion data points. We are the only company with an emotional data repository of that size.

This allows us to build norms and benchmarks. In media and advertising, for example, these norms let our clients compare performance of their ads to that of their competitors.
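As a toy illustration of how such a norm might be used (our example, not Affectiva’s methodology): benchmark a new ad’s emotion score against the distribution of scores from previously tested ads in the same category.

```python
# Toy illustration (ours) of benchmarking against a norm:
# score a new ad against the distribution of previously tested ads.

from bisect import bisect_left

# Hypothetical peak "joy" scores from earlier ads in one category.
norm_scores = sorted([0.22, 0.31, 0.35, 0.41, 0.48, 0.55, 0.63, 0.70])

def percentile_rank(score, norms):
    """Fraction of ads in the norm set that this ad outperforms."""
    return bisect_left(norms, score) / len(norms)

print(f"Your ad beats {percentile_rank(0.58, norm_scores):.0%} of the norm set")
```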

Our emotion database also gives us some really unique insight into how people respond emotionally in a digital context.

What are the benefits of analyzing someone’s emotional state?

Some of our clients use our software to do advertising testing.

Say they have a panel of paid participants who are asked to watch a video online, wherever they are in the world. They consent to having their webcam turned on, and then our software analyzes frame-by-frame how they respond to what they’re seeing.

This data gives these firms the opportunity to optimize, change, and improve their ads, and also determine how to allocate their media spend.

[Image: Affdex market research dashboard]

We also license our tech to developers through free SDKs so their products can sense and analyze emotion in real time. Our tools are available for iOS, Android, Windows, Linux, and OS X, and anyone is welcome to use them.
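As a rough illustration of what “emotion-enabling” an app could look like, here is a toy callback-based sketch. The class and function names are hypothetical, invented for this example; the real SDKs define their own interfaces on each platform.

```python
# Hypothetical sketch of how an app might consume a real-time
# emotion SDK. These names are invented for illustration; consult
# the actual SDK docs for the real interface on each platform.

class EmotionDetector:
    """Toy stand-in for an SDK detector that analyzes camera frames."""

    def __init__(self, on_results):
        self.on_results = on_results  # called once per analyzed frame

    def process_frame(self, frame):
        scores = {"joy": 0.8, "attention": 0.6}  # placeholder output
        self.on_results(frame, scores)

def handle_results(frame, scores):
    # An app might adapt its UI here, e.g. pause a video when joy drops.
    print(f"frame {frame}: {scores}")

detector = EmotionDetector(on_results=handle_results)
for frame_id in range(3):  # stand-in for a live webcam feed
    detector.process_frame(frame_id)
```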

That’s extremely powerful because it allows us to enter into new markets.

We are never going to be healthcare experts or education experts or automotive experts. But there are plenty of companies in those spaces that can integrate our tech into their own tools to augment what they’re doing.

On a technical level, what’s the hardest aspect of emotion analysis?

One of the hardest parts about emotion analysis is that there’s so little real-world emotion data. 

Affectiva had to be a trailblazer, not just in emotion technology but also in understanding things like how cross-cultural differences in emotional expression arise.

As we learn more about human emotion, we hope to use that knowledge to help all the users of our SDK and our partners.

What might people expect from Affectiva in the future?

Just yesterday, we announced $14M in growth capital, in addition to the $20M raised previously. The round was led by Fenox Venture Capital, and investors include IT services giant CAC Holdings, Bandai Namco, and Sega.

We will continue to make our technology easily accessible to any developer that wants to “emotion-enable” their own apps, digital experiences, or whatever solution they’re building.

This will enable us to rapidly enter new verticals (like education, healthcare, automotive, robotics, and HR), in which our technology will be used to build engaging, authentic experiences.

We also believe that in the near future we will see the emotion chip, with our technology embedded in device chips, much like GPS chips are today.

[Image: Affectiva’s product in action]

What’s the most exciting trend in machine learning/AI from Affectiva’s perspective?

The most exciting trend has been the mainstream uses of machine learning, built on top of deep networks, that have emerged in recent years.

Machine learning has advanced drastically and is being applied to transform so many industries. Its move into mainstream products signals the beginning of a very exciting time.

Right now, what’s really interesting is that the narrative around AI has picked up again because there are a lot of really practical applications on the market today. A number of companies and vendors are developing really interesting capabilities.

But our premise is that artificial intelligence systems, especially those designed to interface with humans, have to have an appreciation and understanding of human emotions.

They have to be able to sense human emotion and adapt their responses and reactions to it. We believe that, at least as of today, artificial intelligence systems are missing those abilities.

What advances in machine learning have benefitted Affectiva the most?

Methods that are able to leverage large amounts of data.

At Affectiva, we have a large and growing database that we can leverage to achieve increased accuracy, which has really let us improve our technology.

[Image: Affectiva’s product in action]

Are there any limitations on machine learning that Affectiva would like to see removed?

The reliance on large amounts of labeled (or annotated) data.

In many tasks, large amounts of data are used to make learned models robust and able to generalize to all the variations that occur in the real world.

When you compare this to children’s ability to generalize and learn, you see that there is still much more that is possible.

For example, a child will be able to recognize a giraffe in a zoo despite having only seen a handful of cartoon depictions of giraffes. Machines would require thousands of photographs of giraffes to learn what a giraffe looks like. Even then, a machine would not be able to recognize a cartoon giraffe with what it had learned.

Does Affectiva consider itself a machine learning company?

Absolutely; the core of Affectiva’s technology relies heavily on machine learning.

In situations where machines perform human-like tasks, inferences, or behaviors, the underlying technology is almost always machine learning.

What are the differences between machine learning, deep learning, and AI?

AI is the broadest area of research that covers machines capable of demonstrating intelligence.

This area includes fields of research such as computer vision (understanding images/video), natural language processing (understanding language), and machine learning.

In some cases, the term AI actually refers to AGI (Artificial General Intelligence), which is more akin to the kind of sentient machines you see in movies.

Machine learning is the science of writing algorithms that can learn from examples in order to perform a task. The alternative would be to create rule-based algorithms, whether those rules are grounded in intuition or in science (e.g. physics).
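A toy contrast (our example, not Affectiva’s): a rule-based smile detector hard-codes its decision threshold, while a learned one derives the threshold from labeled examples.

```python
# Toy contrast between a rule-based and a learned classifier.
# This example is ours, for illustration; it is not Affectiva's code.

# Rule-based: a human encodes the decision rule directly.
def smile_rule(mouth_curvature):
    return mouth_curvature > 0.5  # threshold chosen by intuition

# Learned: the threshold is fit from labeled examples instead.
examples = [(0.1, False), (0.3, False), (0.6, True), (0.9, True)]

def fit_threshold(data):
    smiling = [x for x, label in data if label]
    not_smiling = [x for x, label in data if not label]
    return (max(not_smiling) + min(smiling)) / 2  # midpoint split

learned_threshold = fit_threshold(examples)

def smile_learned(mouth_curvature):
    return mouth_curvature > learned_threshold

print(smile_rule(0.45), smile_learned(0.45))
```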

Many subfields of AI, such as computer vision and natural language processing, now rely heavily on machine learning.

Deep learning is an area of research within machine learning in which algorithms create models that have many layers.

Typically these are neural networks, where each layer is relatively simple. However, when you use many of these layers, one on top of the other, you can start modeling things that are very complex. A network with just one layer would be considered shallow.
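Here is a minimal sketch of that shallow-versus-deep contrast using NumPy; the layer sizes and random weights are arbitrary illustrations, not a trained model.

```python
# Sketch of "shallow vs. deep": the same input passed through one
# layer versus a stack of layers. Sizes and weights are arbitrary.

import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(8)  # an input vector (e.g., image features)

def layer(v, out_dim):
    """One simple layer: a linear map followed by a ReLU nonlinearity."""
    w = rng.standard_normal((out_dim, v.shape[0]))
    return np.maximum(0.0, w @ v)

# Shallow: a single layer maps the input straight to the output.
shallow_out = layer(x, 2)

# Deep: many simple layers stacked, each feeding the next, which lets
# the network model much more complex functions of the input.
h = x
for _ in range(5):
    h = layer(h, 8)
deep_out = layer(h, 2)

print(shallow_out, deep_out)
```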

[Image: Affectiva’s product in action]

We recently featured Beyond Verbal, which also works on emotional intelligence. Do you see it as a competitor?

We know Beyond Verbal well. We’re quite friendly with them. They actually participated in our first-ever hackathon just a few months ago, during which we made our tech available to a whole bunch of people over a single weekend.

Ultimately we tackle some of the same problems, in terms of automating and quantifying how people respond emotionally. But they do “voice” and we do “face.”

There are cases where “face” is more accurate than “voice.” But there are also times when you can’t analyze someone’s face, so analyzing their voice is the only option.

We don’t really see it as a competition because typically the use cases are still quite different. There’s a bit of overlap, but to the best of my knowledge, we haven’t directly competed with them.

Right now we see our products more as adjacent to each other, and we think that there might be opportunities for us to partner with companies like Beyond Verbal in the future.

 

This post is part of a series of interviews with machine intelligence companies that are harnessing the power of machine learning and artificial intelligence in innovative ways. Stay tuned for more!
