MD Anderson cares for more than 100,000 patients each year in Houston, and tens of thousands more through its regional and national networks. The cancer center holds a wealth of data on these patients, but it needs a better way to access that data -- especially the large portion that's trapped in notes and other unstructured documents. It will use Watson's natural language processing to begin identifying the medical concepts in that unstructured data.
The Watson technology also will play a key role within Apollo, an MD Anderson-designed "adaptive learning environment." Developed for the Moon Shots program, Apollo promotes the interchange of information and learning between the cancer center's research and clinical wings.
Using data from both the research and clinical sides of MD Anderson, stored in the institution's data warehouse, Watson will also apply its tremendous processing power to match patients with clinical trials, whether at MD Anderson or elsewhere in the nation.
Finally, Watson will integrate the institution's data with evidence from the literature and from clinical trials, and recommend evidence-based treatment options for particular patients to their physicians. To achieve this last goal, it will use a new analytics tool known as the MD Anderson Oncology Expert Advisor.
According to the press release about the partnership, this will be the first time doctors are able to use data in this way to treat cancer patients. "The MD Anderson Oncology Expert Advisor is expected to help physicians improve the future care of cancer patients by enabling comparison of patients based on a new range of data-driven attributes, previously unavailable for analysis," said the release.
For example, the clinical and research sides at MD Anderson will be able to compare groups of patients to identify those who responded differently to therapies and to uncover possible reasons why.
Cancer patients have many different characteristics, including comorbidities, that affect how they respond to different therapies, explained Sean Hogan, VP of healthcare at IBM, in an interview with InformationWeek Healthcare. "This work [with Watson] is helping MD Anderson see how the makeup of each individual correlates with how they progress in their recovery or their disease state," he said.
In addition, he noted, researchers can use Watson to see how pools of patients with shared attributes react to particular treatments. Those cohorts can easily be reconfigured to reflect different variables; for instance, patients with leukemia and diabetes or leukemia and arthritis. This approach strongly resembles one that Intermountain Health Care and Deloitte are taking with their new PopulationMiner analytics tool.
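The reconfigurable-cohort idea Hogan describes can be pictured with a toy sketch. Everything below -- the patient records, field names, and filters -- is hypothetical and invented for illustration; it is not drawn from any IBM, MD Anderson, or Intermountain system.

```python
# Toy sketch of reconfigurable patient cohorts (all data hypothetical).
# Each record lists a patient's diagnoses and treatment response; a
# cohort is just a filter over shared attributes, so swapping the
# filter (leukemia + diabetes vs. leukemia + arthritis) rebuilds the
# cohort without touching the underlying data.

patients = [
    {"id": 1, "conditions": {"leukemia", "diabetes"}, "response": "partial"},
    {"id": 2, "conditions": {"leukemia", "arthritis"}, "response": "complete"},
    {"id": 3, "conditions": {"leukemia"}, "response": "complete"},
    {"id": 4, "conditions": {"diabetes"}, "response": "none"},
]

def cohort(required_conditions):
    """Return patients whose diagnoses include every required condition."""
    required = set(required_conditions)
    return [p for p in patients if required <= p["conditions"]]

leukemia_diabetes = cohort({"leukemia", "diabetes"})    # matches patient 1
leukemia_arthritis = cohort({"leukemia", "arthritis"})  # matches patient 2
```

At real scale the filtering would run against a data warehouse rather than an in-memory list, but the principle is the same: the cohort definition, not the data, is what changes.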
In explaining how IBM researchers compare Watson's logic to the logic of clinicians, Hogan mentioned two analytic tools that IBM Watson recently co-developed with the Cleveland Clinic: WatsonPaths, which will help clinicians make better-informed decisions faster, and Watson EMR Assistant, which will help them cull valuable insights from their EHRs. However, these tools have not yet been embedded in Watson, said Hogan.
Watson is learning and improving how its core capabilities can be applied in the healthcare context, Hogan said. For example, nine months ago, IBM launched a partnership with Memorial Sloan-Kettering Cancer Center (MSK) to apply Watson in the fight against cancer; however, the projects with MSK and MD Anderson are being developed independently of each other.
Hogan stressed that Watson will present clinical options to MD Anderson physicians to help support their medical decisions, rather than telling them what to do. These options will be accompanied by the degree of confidence that Watson has in each of them, along with the sources for those conclusions. If a doctor believes Watson is wrong, she can disregard its suggestions. Watson will learn from those clinical decisions, as well as from patient outcomes that are different from those expected.
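That decision-support pattern -- options ranked by confidence, each tied back to its sources, with the physician free to disregard any of them -- can be sketched in a few lines. The treatment names, scores, and source labels below are all invented for illustration, not actual Watson output.

```python
# Toy sketch of confidence-ranked decision support (all values invented).
# Each option carries a confidence score and the sources behind it;
# the system sorts and displays, and the physician decides.

options = [
    {"treatment": "Regimen B", "confidence": 0.64, "sources": ["case series Y"]},
    {"treatment": "Regimen A", "confidence": 0.82, "sources": ["trial X", "guideline Z"]},
    {"treatment": "Regimen C", "confidence": 0.31, "sources": ["preclinical study W"]},
]

def present_options(options):
    """Sort options by confidence, highest first, for physician review."""
    return sorted(options, key=lambda o: o["confidence"], reverse=True)

for opt in present_options(options):
    sources = ", ".join(opt["sources"])
    print(f"{opt['treatment']}: {opt['confidence']:.0%} (sources: {sources})")
```

The key design point matches Hogan's framing: the software never emits a single answer, only a transparent, sourced ranking that the clinician can accept or override.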
The centerpiece of the Watson program -- what defines Watson as a "cognitive computer" -- is its ability to do natural language processing: to parse language and extract meaning from speech or text. This is what allows Watson, for example, to categorize medical concepts that are embedded in unstructured notes.
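To make "categorizing concepts in unstructured notes" concrete, here is a deliberately naive dictionary-lookup sketch. It is a stand-in for illustration only -- Watson's statistical NLP is far more sophisticated -- and the terms, categories, and sample note are all invented.

```python
# Naive sketch of concept extraction from a clinical note (illustrative
# only; real clinical NLP handles synonyms, negation, context, etc.).

CONCEPTS = {
    "leukemia": "diagnosis",
    "metformin": "medication",
    "fatigue": "symptom",
}

def extract_concepts(note):
    """Return (term, category) pairs whose term appears in the note."""
    text = note.lower()
    return [(term, cat) for term, cat in CONCEPTS.items() if term in text]

note = "Pt with leukemia reports fatigue; continues metformin."
found = extract_concepts(note)
```

Even this crude version shows the payoff: once concepts are tagged with categories, previously unstructured text becomes data that can be queried and compared.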
Nevertheless, Hogan acknowledged that Watson still has a long way to go in this area, even after spending "thousands of hours" with physicians and learning medical terms. Although he could not say how accurate Watson's understanding of clinical documentation is, he noted that it is constantly improving.
"What has been gratifying to us in the organizations we've been working with is that they've been energized by Watson's potential, not deflated by the problem," he said. "So we're seeing these organizations get excited about the possibilities."