{"id":34074,"date":"2018-11-13T17:52:25","date_gmt":"2018-11-13T22:52:25","guid":{"rendered":"https:\/\/digital.hbs.edu\/platform-rctom\/submission\/could-daft-punk-actually-be-robots-how-machine-learning-could-redefine-musical-creativity-at-next-big-sound\/"},"modified":"2018-11-13T17:52:25","modified_gmt":"2018-11-13T22:52:25","slug":"could-daft-punk-actually-be-robots-how-machine-learning-could-redefine-musical-creativity-at-next-big-sound","status":"publish","type":"hck-submission","link":"https:\/\/d3.harvard.edu\/platform-rctom\/submission\/could-daft-punk-actually-be-robots-how-machine-learning-could-redefine-musical-creativity-at-next-big-sound\/","title":{"rendered":"Could Daft Punk Actually Be Robots? How Machine Learning Could Redefine Musical Creativity at Next Big Sound"},"content":{"rendered":"<p><img loading=\"lazy\" decoding=\"async\" class=\" alignright\" src=\"https:\/\/www.telegraph.co.uk\/content\/dam\/news\/2018\/08\/22\/TELEMMGLPICT000172233059_trans_NvBQzQNjv4Bqeo_i_u9APj8RuoebjoAHt7rBiwLVv-x2UIIDI2Y-giA.jpeg?imwidth=1400\" width=\"343\" height=\"343\" \/><\/p>\n<p>A portrait of Edmond Belamy, a fictional man, was created entirely by Artificial Intelligence and sold at auction in October 2018 for $432,500<sup>1<\/sup>. The sale raises interesting questions: what distinguishes art from data? Can the richness of creativity be emulated by the contents of a spreadsheet? In the realm of music, <em>Next Big Sound<\/em>, recently acquired by internet radio giant <em>Pandora<\/em>, employs a data-centric approach to artist growth and strategy. Where future value lies for the company is in applying a similar model to the underlying product: music itself.<\/p>\n<p><em>Next Big Sound<\/em> (NBS) has a clear value proposition for its users: leveraging data analytics and predictive algorithms to create actionable insights for music artists and labels. 
For Artists and Repertoire (A&amp;R) executives at major labels, whose primary responsibility is to sign undiscovered talent to record deals, there is value in knowing that \u201cmusicians who gain 20,000 to 50,000 Facebook fans in one month are four times more likely to eventually reach 1 million\u201d fans<sup>2<\/sup>. By analyzing past growth patterns of currently successful artists, NBS develops predictive algorithms to spot early prospects. Armed with NBS\u2019s projections, labels can gain competitive advantage by improving their scouting processes and signing promising artists earlier in their careers.<\/p>\n<p>From the artist perspective, the sheer number of explanatory variables on which to base career-driving decisions is staggering: Twitter mentions, SoundCloud streams, Spotify playlist adds, blog coverage, radio spins, iTunes downloads and more. By aggregating this data, NBS can offer invaluable insights to artists: from optimal album release dates to expected tour ticket sales by demographic. Most of the data harvested by NBS is publicly available, and competitors could replicate these predictive metrics. 
Where NBS can differentiate its service (and has to some extent already) is in its recommendation algorithms.<\/p>\n<p><a href=\"https:\/\/d3.harvard.edu\/platform-rctom\/wp-content\/uploads\/sites\/4\/2018\/11\/Screen-Shot-2018-11-13-at-5.13.48-PM.png\"><img loading=\"lazy\" decoding=\"async\" class=\" wp-image-33769 alignleft\" src=\"https:\/\/d3.harvard.edu\/platform-rctom\/wp-content\/uploads\/sites\/4\/2018\/11\/Screen-Shot-2018-11-13-at-5.13.48-PM-1024x304.png\" alt=\"\" width=\"627\" height=\"186\" srcset=\"https:\/\/d3.harvard.edu\/platform-rctom\/wp-content\/uploads\/sites\/4\/2018\/11\/Screen-Shot-2018-11-13-at-5.13.48-PM-1024x304.png 1024w, https:\/\/d3.harvard.edu\/platform-rctom\/wp-content\/uploads\/sites\/4\/2018\/11\/Screen-Shot-2018-11-13-at-5.13.48-PM-300x89.png 300w, https:\/\/d3.harvard.edu\/platform-rctom\/wp-content\/uploads\/sites\/4\/2018\/11\/Screen-Shot-2018-11-13-at-5.13.48-PM-768x228.png 768w, https:\/\/d3.harvard.edu\/platform-rctom\/wp-content\/uploads\/sites\/4\/2018\/11\/Screen-Shot-2018-11-13-at-5.13.48-PM-600x178.png 600w\" sizes=\"auto, (max-width: 627px) 100vw, 627px\" \/><\/a>Predictions help capture existing value; recommendations create new sources of value. Forecasting ticket sales among students in Boulder, Colorado on a Thursday in November can be useful. But beyond city-specific predictions, NBS can also recommend potentially untapped markets that artists should be targeting. In an era where artist earnings from touring often exceed those from streaming royalties and sales, a well-timed, optimally routed tour can make an artist\u2019s career<sup>3<\/sup>.<\/p>\n<p>Serving strategic recommendations to artists is a competency that could fundamentally reshape the music industry and elevate NBS to juggernaut status. NBS simply must expand its offerings from logistical suggestions to creative directions. 
To maximize enterprise value, NBS should leverage its data to develop three core technologies: a feedback mechanism rooted in its \u201cpredictive success\u201d model<sup>4<\/sup>, a creative direction algorithm using target demographics as inputs, and a music-producing computer system.<\/p>\n<p>NBS can recommend which songs artists should select as their promotional \u201csingles\u201d based on past performance of similar songs<sup>5<\/sup>. Functionally, NBS\u2019s algorithms can accurately recognize whether a piece of music will be popular<sup>2<\/sup>. Consider this capability as a feedback service: artists upload an unreleased project (e.g. a partially finished musical demo) to the service, and NBS predicts whether the music will perform well commercially. Further, the algorithm could extract successful traits from popular songs and suggest what changes or additions artists should make to their works to improve commercial viability.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignleft\" src=\"https:\/\/buyourobot.b-cdn.net\/wp-content\/uploads\/edd\/2015\/09\/00511_robot_headphones_thumb.jpg\" width=\"413\" height=\"508\" \/>Taken a step further, NBS should develop a creative direction algorithm for artists by taking <em>Pandora<\/em>\u2019s listener recommendation engine and reversing it. <em>Pandora<\/em> offers ideal listening experiences to its users by \u201calgorithmically curat[ing playlists] based on an analysis of data from sensors in users\u2019 mobile devices, the users\u2019 previous music listening behaviour, users\u2019 relationships with other humans via social media, and acoustic characteristics of millions of songs available in the service\u2019s music library\u201d<sup>6<\/sup>. The same formula, applied in reverse, could be an invaluable service for artists: plug in their target demographic (e.g. 
ages 24-30, Swedish, college educated, female) and the algorithm dictates specific musical choices artists should make in their creation process to appeal to that demographic: instrument combinations, beats per minute, song length, rhythm syncopations, lyrical content and more. With these insights, artists could better cater music to their existing fan base, or specifically target new profitable demographics to spur growth.<\/p>\n<p>At the extreme, NBS could challenge the very existence of musical artists. If the company\u2019s data analytics can predict what type of music will be enjoyed by any demographic and which elements of this music stand to make a particular song successful, could it not harness these technologies to create its own music entirely with AI? These days, some major artists perform anonymously (Daft Punk) or as virtual bands (Gorillaz)<sup>7<\/sup>. If NBS were to successfully produce hit songs using only its prediction and recommendation algorithms, attributing the music to an anonymous or virtual act, would anyone know? If the next chart topper resulted from machine learning, would anyone be incensed? Perhaps Edmond Belamy has the answers.<\/p>\n<p>(Word Count 792)<\/p>\n<p>&nbsp;<\/p>\n<p><strong>Citations<\/strong><\/p>\n<p>[1] \u201cIs Artificial Intelligence Set to Become Art&#8217;s Next Medium?\u201d\u00a0<em>Christie&#8217;s<\/em>, 16 Oct. 2018, www.christies.com\/features\/A-collaboration-between-two-artists-one-human-one-a-machine-9332-1.aspx.<\/p>\n<p>[2] Greenburg, Zack O&#8217;Malley. \u201cMoneyball For Music: The Rise of Next Big Sound.\u201d\u00a0<em>Forbes<\/em>, Forbes Magazine, 13 Feb. 2013, www.forbes.com\/sites\/zackomalleygreenburg\/2013\/02\/13\/moneyball-for-music-the-rise-of-next-big-sound\/.<\/p>\n<p>[3] Titlow, John Paul. \u201cInside Pandora&#8217;s Plan To Reinvent Itself-And Beat Back Apple And Spotify.\u201d\u00a0<em>Fast Company<\/em>, Fast Company, 14 Apr. 
2017, www.fastcompany.com\/3058719\/inside-pandoras-plan-to-reinvent-itself-and-beat-back-apple-and-sp.<\/p>\n<p>[4] U.S. Patent Application No. 14\/302\/200, Publication No. US 2015\/0032673 A1 (filed Jun. 11, 2014) (Victor HU, Alex WHITE, applicants).<\/p>\n<p>[5] Bonazzo, John. \u201cNext Big Industry to Embrace Moneyball: The Music Business.\u201d\u00a0<em>Observer<\/em>, Observer, 4 Jan. 2017, observer.com\/2017\/01\/pandora-next-big-sound-moneyball-music\/.<\/p>\n<p>[6] Wikstrom, Patrik (2015). Will algorithmic playlist curation be the end of music stardom? <em>Journal of Business Anthropology<\/em>, <em>4<\/em>(2), pp. 278-284.<\/p>\n<p>[7] Wired Staff. \u201cKeeping It (Un)Real.\u201d\u00a0<em>Wired<\/em>, Conde Nast, 27 July 2018, www.wired.com\/2005\/07\/gorillaz-2\/.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>In the realm of music, Next Big Sound employs a data-centric approach to artist growth and strategy. Where future value lies for the company is in applying a similar model to the underlying product: music itself.<\/p>\n","protected":false},"author":11787,"featured_media":34201,"comment_status":"open","ping_status":"closed","template":"","categories":[4365,4905,2122,346,4904],"class_list":["post-34074","hck-submission","type-hck-submission","status-publish","has-post-thumbnail","hentry","category-artifical-intelligence","category-daft-punk","category-data-analytics","category-machine-learning","category-next-big-sound","hck-taxonomy-organization-next-big-sound","hck-taxonomy-industry-music","hck-taxonomy-country-united-states"],"connected_submission_link":"https:\/\/d3.harvard.edu\/platform-rctom\/assignment\/rc-tom-challenge-2018\/","yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.3 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Could Daft Punk Actually Be Robots? 
How Machine Learning Could Redefine Musical Creativity at Next Big Sound - Technology and Operations Management<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/d3.harvard.edu\/platform-rctom\/submission\/could-daft-punk-actually-be-robots-how-machine-learning-could-redefine-musical-creativity-at-next-big-sound\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Could Daft Punk Actually Be Robots? How Machine Learning Could Redefine Musical Creativity at Next Big Sound - Technology and Operations Management\" \/>\n<meta property=\"og:description\" content=\"In the realm of music, Next Big Sound employs a data-centric approach to artist growth and strategy. Where future value lies for the company is in applying a similar model to the underlying product: music itself.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/d3.harvard.edu\/platform-rctom\/submission\/could-daft-punk-actually-be-robots-how-machine-learning-could-redefine-musical-creativity-at-next-big-sound\/\" \/>\n<meta property=\"og:site_name\" content=\"Technology and Operations Management\" \/>\n<meta property=\"og:image\" content=\"https:\/\/d3.harvard.edu\/platform-rctom\/wp-content\/uploads\/sites\/4\/2018\/11\/dp-TOM.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"1038\" \/>\n\t<meta property=\"og:image:height\" content=\"400\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data1\" content=\"5 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/d3.harvard.edu\\\/platform-rctom\\\/submission\\\/could-daft-punk-actually-be-robots-how-machine-learning-could-redefine-musical-creativity-at-next-big-sound\\\/\",\"url\":\"https:\\\/\\\/d3.harvard.edu\\\/platform-rctom\\\/submission\\\/could-daft-punk-actually-be-robots-how-machine-learning-could-redefine-musical-creativity-at-next-big-sound\\\/\",\"name\":\"Could Daft Punk Actually Be Robots? How Machine Learning Could Redefine Musical Creativity at Next Big Sound - Technology and Operations Management\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/d3.harvard.edu\\\/platform-rctom\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/d3.harvard.edu\\\/platform-rctom\\\/submission\\\/could-daft-punk-actually-be-robots-how-machine-learning-could-redefine-musical-creativity-at-next-big-sound\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/d3.harvard.edu\\\/platform-rctom\\\/submission\\\/could-daft-punk-actually-be-robots-how-machine-learning-could-redefine-musical-creativity-at-next-big-sound\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/d3.harvard.edu\\\/platform-rctom\\\/wp-content\\\/uploads\\\/sites\\\/4\\\/2018\\\/11\\\/dp-TOM.jpg\",\"datePublished\":\"2018-11-13T22:52:25+00:00\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/d3.harvard.edu\\\/platform-rctom\\\/submission\\\/could-daft-punk-actually-be-robots-how-machine-learning-could-redefine-musical-creativity-at-next-big-sound\\\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/d3.harvard.edu\\\/platform-rctom\\\/submission\\\/could-daft-punk-actually-be-robots-how-machine-learning-could-redefine-musical-creativity-at-next-big-sound\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-U
S\",\"@id\":\"https:\\\/\\\/d3.harvard.edu\\\/platform-rctom\\\/submission\\\/could-daft-punk-actually-be-robots-how-machine-learning-could-redefine-musical-creativity-at-next-big-sound\\\/#primaryimage\",\"url\":\"https:\\\/\\\/d3.harvard.edu\\\/platform-rctom\\\/wp-content\\\/uploads\\\/sites\\\/4\\\/2018\\\/11\\\/dp-TOM.jpg\",\"contentUrl\":\"https:\\\/\\\/d3.harvard.edu\\\/platform-rctom\\\/wp-content\\\/uploads\\\/sites\\\/4\\\/2018\\\/11\\\/dp-TOM.jpg\",\"width\":1038,\"height\":400},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/d3.harvard.edu\\\/platform-rctom\\\/submission\\\/could-daft-punk-actually-be-robots-how-machine-learning-could-redefine-musical-creativity-at-next-big-sound\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\\\/\\\/d3.harvard.edu\\\/platform-rctom\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Submissions\",\"item\":\"https:\\\/\\\/d3.harvard.edu\\\/platform-rctom\\\/submission\\\/\"},{\"@type\":\"ListItem\",\"position\":3,\"name\":\"Could Daft Punk Actually Be Robots? How Machine Learning Could Redefine Musical Creativity at Next Big Sound\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/d3.harvard.edu\\\/platform-rctom\\\/#website\",\"url\":\"https:\\\/\\\/d3.harvard.edu\\\/platform-rctom\\\/\",\"name\":\"Technology and Operations Management\",\"description\":\"MBA Student Perspectives\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/d3.harvard.edu\\\/platform-rctom\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Could Daft Punk Actually Be Robots? 
How Machine Learning Could Redefine Musical Creativity at Next Big Sound - Technology and Operations Management","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/d3.harvard.edu\/platform-rctom\/submission\/could-daft-punk-actually-be-robots-how-machine-learning-could-redefine-musical-creativity-at-next-big-sound\/","og_locale":"en_US","og_type":"article","og_title":"Could Daft Punk Actually Be Robots? How Machine Learning Could Redefine Musical Creativity at Next Big Sound - Technology and Operations Management","og_description":"In the realm of music, Next Big Sound employs a data-centric approach to artist growth and strategy. Where future value lies for the company is in applying a similar model to the underlying product: music itself.","og_url":"https:\/\/d3.harvard.edu\/platform-rctom\/submission\/could-daft-punk-actually-be-robots-how-machine-learning-could-redefine-musical-creativity-at-next-big-sound\/","og_site_name":"Technology and Operations Management","og_image":[{"width":1038,"height":400,"url":"https:\/\/d3.harvard.edu\/platform-rctom\/wp-content\/uploads\/sites\/4\/2018\/11\/dp-TOM.jpg","type":"image\/jpeg"}],"twitter_card":"summary_large_image","twitter_misc":{"Est. reading time":"5 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"WebPage","@id":"https:\/\/d3.harvard.edu\/platform-rctom\/submission\/could-daft-punk-actually-be-robots-how-machine-learning-could-redefine-musical-creativity-at-next-big-sound\/","url":"https:\/\/d3.harvard.edu\/platform-rctom\/submission\/could-daft-punk-actually-be-robots-how-machine-learning-could-redefine-musical-creativity-at-next-big-sound\/","name":"Could Daft Punk Actually Be Robots? 
How Machine Learning Could Redefine Musical Creativity at Next Big Sound - Technology and Operations Management","isPartOf":{"@id":"https:\/\/d3.harvard.edu\/platform-rctom\/#website"},"primaryImageOfPage":{"@id":"https:\/\/d3.harvard.edu\/platform-rctom\/submission\/could-daft-punk-actually-be-robots-how-machine-learning-could-redefine-musical-creativity-at-next-big-sound\/#primaryimage"},"image":{"@id":"https:\/\/d3.harvard.edu\/platform-rctom\/submission\/could-daft-punk-actually-be-robots-how-machine-learning-could-redefine-musical-creativity-at-next-big-sound\/#primaryimage"},"thumbnailUrl":"https:\/\/d3.harvard.edu\/platform-rctom\/wp-content\/uploads\/sites\/4\/2018\/11\/dp-TOM.jpg","datePublished":"2018-11-13T22:52:25+00:00","breadcrumb":{"@id":"https:\/\/d3.harvard.edu\/platform-rctom\/submission\/could-daft-punk-actually-be-robots-how-machine-learning-could-redefine-musical-creativity-at-next-big-sound\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/d3.harvard.edu\/platform-rctom\/submission\/could-daft-punk-actually-be-robots-how-machine-learning-could-redefine-musical-creativity-at-next-big-sound\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/d3.harvard.edu\/platform-rctom\/submission\/could-daft-punk-actually-be-robots-how-machine-learning-could-redefine-musical-creativity-at-next-big-sound\/#primaryimage","url":"https:\/\/d3.harvard.edu\/platform-rctom\/wp-content\/uploads\/sites\/4\/2018\/11\/dp-TOM.jpg","contentUrl":"https:\/\/d3.harvard.edu\/platform-rctom\/wp-content\/uploads\/sites\/4\/2018\/11\/dp-TOM.jpg","width":1038,"height":400},{"@type":"BreadcrumbList","@id":"https:\/\/d3.harvard.edu\/platform-rctom\/submission\/could-daft-punk-actually-be-robots-how-machine-learning-could-redefine-musical-creativity-at-next-big-sound\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/d3.harvard.edu\/platform-rctom\/"},{"@type":"ListItem","p
osition":2,"name":"Submissions","item":"https:\/\/d3.harvard.edu\/platform-rctom\/submission\/"},{"@type":"ListItem","position":3,"name":"Could Daft Punk Actually Be Robots? How Machine Learning Could Redefine Musical Creativity at Next Big Sound"}]},{"@type":"WebSite","@id":"https:\/\/d3.harvard.edu\/platform-rctom\/#website","url":"https:\/\/d3.harvard.edu\/platform-rctom\/","name":"Technology and Operations Management","description":"MBA Student Perspectives","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/d3.harvard.edu\/platform-rctom\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"}]}},"_links":{"self":[{"href":"https:\/\/d3.harvard.edu\/platform-rctom\/wp-json\/wp\/v2\/hck-submission\/34074","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/d3.harvard.edu\/platform-rctom\/wp-json\/wp\/v2\/hck-submission"}],"about":[{"href":"https:\/\/d3.harvard.edu\/platform-rctom\/wp-json\/wp\/v2\/types\/hck-submission"}],"author":[{"embeddable":true,"href":"https:\/\/d3.harvard.edu\/platform-rctom\/wp-json\/wp\/v2\/users\/11787"}],"replies":[{"embeddable":true,"href":"https:\/\/d3.harvard.edu\/platform-rctom\/wp-json\/wp\/v2\/comments?post=34074"}],"version-history":[{"count":0,"href":"https:\/\/d3.harvard.edu\/platform-rctom\/wp-json\/wp\/v2\/hck-submission\/34074\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/d3.harvard.edu\/platform-rctom\/wp-json\/wp\/v2\/media\/34201"}],"wp:attachment":[{"href":"https:\/\/d3.harvard.edu\/platform-rctom\/wp-json\/wp\/v2\/media?parent=34074"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/d3.harvard.edu\/platform-rctom\/wp-json\/wp\/v2\/categories?post=34074"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}