From gaming to healthcare: Microsoft's approach to visualizing cancer

Can machine learning help transform radiation therapy for cancer?

The notion that the same underlying technology that currently helps you master the Boot Scootin' Boogie line dance on Country Dance All Stars for Xbox 360 could soon help your doctor develop your cancer treatment plan sounds absurd, but that's exactly what Microsoft is attempting to do with Project InnerEye [1].

For over four decades, Microsoft has been at the forefront of consumer technological innovation, launching a host of products that have dramatically transformed how people interact: MS-DOS and Windows altered how humans interact with machines; Microsoft Office changed how people interact with their colleagues, clients, and customers; and Xbox Kinect transformed how gamers interact with their consoles and, by extension, their friends [2]. The tech giant now has its sights set on disrupting yet another interaction: the delicate relationship between physicians and their patients. Microsoft's Project InnerEye is leveraging machine learning and computer vision technology to develop tools that aid radiation oncologists in identifying, targeting, and monitoring cancer in their patients. The question remains, however, whether the team can successfully apply its deep expertise in AI research to the complex, slow-moving, and heavily regulated world of healthcare [3].

Unlike the Kinect, which captures a gamer's movements by looking at the outside of their body, Project InnerEye is doing exactly what its name suggests: looking inside a patient's body. However, the software and hardware innovations that enable both pieces of technology are shared: (1) a supervised machine learning technique known as decision forests (random forests), which Microsoft adapted for the Kinect's body-part recognition, and (2) specialized hardware accelerators, including Field Programmable Gate Arrays (FPGAs) and Graphics Processing Units (GPUs) originally developed for gaming applications, that deliver both computational speed and programming flexibility [4]. With Project InnerEye and other healthcare-focused programs currently being pursued under Microsoft's Healthcare NExT initiative, the company is trying to marry its history of technological prowess with recent advances in machine learning and cloud computing to compete in the deeply challenging but meaningful space of healthcare innovation [5].
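To make the decision-forest idea concrete: a random forest can be trained to label each voxel of a scan as healthy or cancerous based on simple image features. InnerEye's actual models, features, and training data are proprietary and not public; the sketch below is only a minimal illustration of the general technique, using scikit-learn's `RandomForestClassifier` on synthetic stand-in data.

```python
# Minimal sketch of voxel-wise tissue classification with a random forest.
# The features and data here are synthetic placeholders, not InnerEye's.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic "voxels": each row is (intensity, local mean, local variance).
# Tumour voxels are simulated as brighter with higher local variance.
healthy = rng.normal(loc=[0.3, 0.3, 0.05], scale=0.1, size=(500, 3))
tumour = rng.normal(loc=[0.7, 0.7, 0.20], scale=0.1, size=(500, 3))
X = np.vstack([healthy, tumour])
y = np.array([0] * 500 + [1] * 500)  # 0 = healthy, 1 = tumour

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, y)

# Classify two unseen voxels: one healthy-looking, one tumour-like.
print(clf.predict([[0.25, 0.30, 0.04], [0.75, 0.70, 0.22]]))
```

In practice the same trained forest is applied independently to millions of voxels per scan, which is what makes hardware acceleration on FPGAs and GPUs so valuable.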

Project InnerEye's approach to machine learning is one its researchers call "Assisted AI," which focuses on developing tools that supplement rather than substitute for the expertise of the human physicians who have traditionally performed the work Project InnerEye is automating [6]. Without the assistance of machine learning, radiation oncologists must spend hours per patient manually analyzing a set of diagnostic images slice by slice to identify and delineate cancerous versus healthy tissue. Only after carefully performing this time-consuming and highly specialized task can they begin to develop the radiotherapy treatment plan. Project InnerEye promises to offload this particular task from the physician, freeing them to focus on other equally critical but less tedious work. With InnerEye, a physician simply sends anonymized, encrypted scan images directly to the program, which is then able to complete the markup and develop a 3D model of the tissue in a matter of minutes [7]. To get to this point, the team at Project InnerEye trained its machine learning algorithm using scans from hundreds of patients representing a wide range of hospital geographies, imaging modalities (e.g., MRI vs. CT vs. PET), and image resolutions, and then validated its accuracy by comparing the results to those of resident experts [8].
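Validating an automated contour against an expert's manual markup is commonly done with an overlap measure such as the Dice coefficient; whether the InnerEye team used this exact metric is not stated in the sources, so the sketch below is an assumed, simplified illustration of that comparison on toy 2D masks (real validation would use full 3D scan volumes).

```python
# Sketch: comparing a model's contour to an expert's using the Dice score.
import numpy as np

def dice_score(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Dice coefficient: 2*|A intersect B| / (|A| + |B|); 1.0 is perfect overlap."""
    intersection = np.logical_and(mask_a, mask_b).sum()
    total = mask_a.sum() + mask_b.sum()
    return 2.0 * intersection / total if total else 1.0

expert = np.zeros((10, 10), dtype=bool)
expert[2:8, 2:8] = True   # expert-drawn tumour region (36 voxels)

model = np.zeros((10, 10), dtype=bool)
model[3:8, 2:8] = True    # model's contour, one row short (30 voxels)

print(round(dice_score(model, expert), 3))  # 2*30 / (30+36) = 0.909
```

A score near 1.0 indicates the automated contour closely matches the expert's, which is the bar an assistive tool must clear before physicians will trust it.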

Looking forward, in addition to offloading tasks currently performed by physicians, the Project InnerEye technology could alter the treatment paradigm for cancer patients. Whereas today imaging and markup are generally performed only once because of the time and expense involved, Project InnerEye could help usher in an era of "adaptive radiotherapy" in which these tasks are performed regularly throughout a treatment series to actively monitor the tumor's progression and more precisely target the therapy [7]. Interestingly, Project InnerEye will likely face adoption challenges that other AI-enabled technology has not. Whereas AI and automation have already started to dramatically alter the workforce for many low-skilled jobs that managers have determined machines can perform more cost-effectively than human labor (e.g., cashiers, call centers), Project InnerEye's targeted end-users are the very people whose work it intends to perform. This reality requires the team to carefully thread the needle of providing valuable automation without wholly replacing the physician's role in the process, which in turn requires close-knit, interactive relationships with the physician community to understand their needs and motivations.

While Project InnerEye has shown impressive potential, many questions remain regarding its future adoption. Will physicians ever fully trust a machine to provide patient-critical health information? If the machine learning algorithm makes a mistake that results in a serious adverse event for a patient, who is to blame: Microsoft or the treating physician?


(Word Count: 798)


[1] Game Mill Entertainment, "Country Dance All Stars," , accessed November 2018.

[2] Malathi Nayak, "Timeline: Microsoft's journey: four decades, three CEOs," Reuters Business News, February 4, 2014, , accessed November 2018.

[3] Allison Linn, "Microsoft looks to healthcare partners for ways to bring AI benefits to cancer patients," The AI Blog, Microsoft, November 28, 2017, , accessed November 2018.

[4] Gerald Lynch, "From Kinect to InnerEye: How Microsoft is supercharging gaming tech with AI smarts to help diagnose cancer," TechRadar, May 9, 2017, , accessed November 2018.

[5] Tom Warren, "Microsoft Healthcare is a new effort to push doctors to the cloud," The Verge, June 27, 2018, , accessed November 2018.

[6] Microsoft Research, "Five-minute overview of the InnerEye research project," YouTube, published September 22, 2016, , accessed November 2018.

[7] Ian Sample, "'It's going to create a revolution': how AI is transforming the NHS," The Guardian, July 4, 2018, , accessed November 2018.

[8] Cynthia E. Keen, "AI drives analysis of medical images," Healthcare-In-Europe.com, February 27, 2018, , accessed November 2018.

[9] Ian Sample, "Joseph Stiglitz on artificial intelligence: 'We're going towards a more divided society,'" The Guardian, September 28, 2018, , accessed November 2018.


Student comments on From gaming to healthcare: Microsoft's approach to visualizing cancer

  1. Such a great piece.

    Project InnerEye clearly stands to deliver huge benefits to patients and doctors, not only by speeding up diagnoses, but saving costs, too, and both points are well-researched and articulated here.

    What I’m left wondering is a question common to any application of AI to matters of great moral or ethical significance: even if InnerEye achieves near perfect accuracy, will doctors and patients ever be comfortable relying on it when a life is on the line? Will doctors have to double-check its analysis, and if so, how does that limit the efficiencies it delivers?
