  {"id":12035,"date":"2020-04-20T10:48:27","date_gmt":"2020-04-20T14:48:27","guid":{"rendered":"https:\/\/digital.hbs.edu\/platform-digit\/submission\/affectiva-building-ai-that-can-read-human-emotions\/"},"modified":"2020-04-20T10:48:27","modified_gmt":"2020-04-20T14:48:27","slug":"affectiva-building-ai-that-reads-human-emotions","status":"publish","type":"hck-submission","link":"https:\/\/d3.harvard.edu\/platform-digit\/submission\/affectiva-building-ai-that-reads-human-emotions\/","title":{"rendered":"Affectiva: building AI that reads human emotions"},"content":{"rendered":"<p>As the amount of data we collect increases, companies are leveraging advanced analytics to drive insights also from the most unusual sources of data, including people facial expressions or speech. In fact, an interesting application of artificial intelligence (AI) and machine learning (ML) lies in the space of emotions. Emotion AI is an emotion detection technology, that \u201c<em>combines analysis of both face and speech as complementary signals to provide richer insight into the human expression of <\/em><em>emotion<\/em>\u201d. <a href=\"#_ftn1\" name=\"_ftnref1\">[1]<\/a> Affectiva, a Boston-based company founded in 2009 as a spin-off from the MIT Media Lab, is a pioneer in this space.<a href=\"#_ftn2\" name=\"_ftnref2\">[2]<\/a><\/p>\n<p>By using a standard webcam, Affectiva can identify a face and its key landmarks (e.g. mouth corners, nose tip etc.), to classify facial expressions into seven main emotions (anger, contempt, disgust, fear, joy, sadness and surprise). With pre-recorded audio, speech detection can be integrated as well, with the ability to classify \u201chow\u201d something is said, with a frequency of few hundred millisecond. With about six million faces analyzed in 87 Countries, data accuracy is in the high 90<sup>th<\/sup> percentile.<\/p>\n<p><iframe loading=\"lazy\" title=\"Rana el Kaliouby: This app knows how you feel -- from the look on your face\" src=\"https:\/\/embed.ted.com\/talks\/rana_el_kaliouby_this_app_knows_how_you_feel_from_the_look_on_your_face\" width=\"640\" height=\"361\" frameborder=\"0\" scrolling=\"no\" webkitAllowFullScreen mozallowfullscreen allowFullScreen><\/iframe><\/p>\n<p><strong>Current business model and future opportunities<\/strong><\/p>\n<p>The two core applications of emotion AI for Affectiva are in the space of media analytics and automotive. The media analytics application is targeted at advertisers that want to test their consumers\u2019 reaction to videos, ads and TV shows.<a href=\"#_ftn3\" name=\"_ftnref3\">[3]<\/a>\u00a0 With such information, it is possible to improve ad story flow, trailer creation or analyze ad character perception. Results at this stage are extremely promising, also considering the partnerships with Coca Cola and Mars,<a href=\"#_ftn4\" name=\"_ftnref4\">[4]<\/a> as well as the strategic investments made by Kantar (subsidiary of WPP, the largest advertising conglomerate).<a href=\"#_ftn5\" name=\"_ftnref5\">[5]<\/a> \u00a0In automotive, Affectiva developed an In-Cabin Sensing (ICS) that understand emotions within a vehicle, in real time. <a href=\"#_ftn6\" name=\"_ftnref6\">[6]<\/a> \u00a0This application can support the safety of passengers, for example for ridesharing providers or fleet management companies, by detecting driver impairment. 
Current business model and future opportunities

The two core applications of emotion AI for Affectiva are media analytics and automotive. The media analytics application targets advertisers who want to test consumers' reactions to videos, ads and TV shows. [3] With that information, they can improve ad story flow and trailer creation, or analyze how an ad's characters are perceived. Results at this stage are extremely promising, as suggested by partnerships with Coca-Cola and Mars [4] and by the strategic investment made by Kantar (a subsidiary of WPP, the largest advertising conglomerate). [5] In automotive, Affectiva developed an In-Cabin Sensing (ICS) solution that reads emotions inside a vehicle in real time. [6] This application can support passenger safety, for example for ridesharing providers or fleet management companies, by detecting driver impairment. In this vertical, too, partnerships are strong, as in the case of Accenture and Faurecia, as well as other software and product-safety companies. [7][8]

Overall, the total addressable market for affective computing, across these two industries and all the others into which the company could expand, can be huge. Tractica estimated that the AI software market may reach $118.6 billion by 2025. [9] While that projection covers all AI use cases, emotion AI can capture a significant share of this growth by adding an emotion component to many AI use cases and making them more human-like.

Affectiva uses a clear monetization strategy to leverage both existing and new opportunities: it offers SaaS services to current customer groups (advertisers and OEMs) and SDKs (software development kits) that allow developers to build new applications on top of Affectiva's core technology (now distributed through iMotions). [10]

Key risks and mitigation strategies

Going forward, there are two main challenges for a full global scale-up of Affectiva.

The first challenge is technical: cultural norms and the ways people express emotions differ widely across geographies, cultures and product categories. To obtain robust results, the company will need to work across many product/geography/culture combinations (plus any other variable that may be relevant) to define the right descriptive benchmarks. To scale the platform in a structured and sustainable way, a modular expansion may be needed, focusing first on customer groups with more data available and then broadening step by step, rather than expanding immediately to mass users. [11]

The second challenge is ethical, given the lack of regulation and the fear that emotion AI can drive discrimination, as predictive-sentencing and housing algorithms have. Many activists are trying to boycott facial recognition technologies, and some countries are already assessing whether to ban them, considering the impact they can have on people's lives and the unwanted consequences they can produce. To mitigate such risks, Affectiva will need to work with authorities to define the right security and privacy protocols at the industry level. [12]

While the first challenge can be overcome by collecting more data and expanding research in partnership with more brands and customers, the second is more crucial to the survival and expansion of the company. Assuming effective lobbying with institutions can ensure survival, Affectiva will still need to tackle its main challenge in terms of growth. Opening up SDKs to developers again may trigger new use cases, new sources of revenue and potentially network effects, but it would also diminish Affectiva's control over how the technology is used.
This may accelerate growth, but at the same time it brings Black Mirror-style outcomes closer.

[1] https://www.affectiva.com/emotion-ai-overview/
[2] https://www.crunchbase.com/organization/affectiva#section-overview
[3] https://www.affectiva.com/product/affdex-for-market-research/
[4] https://www.forbes.com/sites/samarmarwan/2018/11/29/affectiva-emotion-ai-ceo-rana-el-kaliouby/#5d6441821572
[5] https://www.wpp.com/news/2011/07/kantar-makes-strategic-investment-in-affectiva
[6] http://go.affectiva.com/auto
[7] https://newsroom.accenture.com/news/accenture-faurecia-and-affectiva-team-to-develop-the-car-cabin-of-the-future.htm
[8] https://xconomy.com/boston/2018/03/21/affectiva-launches-a-i-tech-to-help-cars-sense-your-emotions/
[9] https://tractica.omdia.com/newsroom/press-releases/artificial-intelligence-software-market-to-reach-118-6-billion-in-annual-worldwide-revenue-by-2025/
[10] https://imotions.com/contact-us/
[11] https://www.affectiva.com/wp-content/uploads/2017/03/Does_Facial_Coding_Generalize_Across_Cultures_ASIA.pdf
[12] https://www.technologyreview.com/2020/02/14/844765/ai-emotion-recognition-affective-computing-hirevue-regulation-ethics/