Tampa VA Doctors Use Artificial Intelligence To Diagnose Cancer

Feb 6, 2020

Researchers at the Tampa veterans’ hospital are training computers how to diagnose cancer. It’s one example of how the Department of Veterans Affairs is expanding artificial intelligence development.


Inside a laboratory at the James A. Haley Veterans’ Hospital in Tampa, machines are rapidly processing tubes of patients' body fluids and tissue samples. Pathologists examine those samples under microscopes to spot diseases like cancer.

But distinguishing certain features of a cancer cell, which can drastically affect treatment, can be difficult, so Drs. Stephen Mastorides and Andrew Borkowski decided to get a computer involved.

They recently published a study comparing how different AI programs performed when training computers to diagnose cancer.

In a series of experiments, they uploaded hundreds of images of slides containing lung and colon tissues into artificial intelligence software.

Some tissues were healthy and others had different types of cancer, including squamous cell and adenocarcinoma. They chose those organs because lung and colon cancers are some of the most commonly diagnosed cancers among VA patients.

Then they tested the software with more images the computer had never seen before.

"The module was able to put it together and it was able to differentiate is it a cancer or is it not a cancer, and not only that, but it was also able to say what kind of cancer is it,” said Borkowski, chief of the hospital's molecular diagnostics section.

The doctors were harnessing the power of what's known as machine learning. Software pre-trained with millions of images of everyday objects, like dogs and trees, can learn to distinguish new ones. Dr. Mastorides, chief of pathology and laboratory medicine services, said it only took minutes to teach the computer what cancerous tissue looks like.
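In machine-learning terms, this approach is often called transfer learning: a network already trained on millions of everyday images supplies general-purpose features, and only a small final classifier is retrained on the pathology slides. The sketch below is illustrative, not the hospital's actual pipeline; the feature vectors and class names are invented, and a simple nearest-centroid rule stands in for the retrained layer.

```python
# Toy sketch of the transfer-learning idea. Assumption: a pre-trained
# network has already converted each slide into a short feature vector;
# here those vectors are made up by hand.

def centroid(vectors):
    """Average a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def classify(sample, centroids):
    """Assign the label whose class centroid is nearest (Euclidean)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(centroids, key=lambda label: dist(sample, centroids[label]))

# "Training": hand-made feature vectors for three tissue classes.
training = {
    "benign":         [[0.1, 0.2], [0.0, 0.3], [0.2, 0.1]],
    "adenocarcinoma": [[0.9, 0.8], [1.0, 0.7], [0.8, 0.9]],
    "squamous":       [[0.1, 0.9], [0.2, 1.0], [0.0, 0.8]],
}
centroids = {label: centroid(vecs) for label, vecs in training.items()}

# "Testing": a feature vector the classifier has never seen.
print(classify([0.85, 0.75], centroids))  # → adenocarcinoma
```

In a real system the feature extractor would be a deep network and the classifier would be retrained on thousands of labeled slides, but the train-then-test-on-unseen-samples workflow is the same.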

"Our earliest studies showed accuracies over 95%," he said.

The AI software was able to differentiate between benign colon tissue (left) and cancerous colon tissue. Beyond that, it was able to determine what kind of colon cancer it was, in the case of the image on the right, adenocarcinoma. COURTESY OF DR. ANDREW BORKOWSKI/JAMES A. HALEY VETERANS' HOSPITAL

Enhance, not replace

The doctors say this kind of technology could be a vital asset to rural veterans’ clinics, where pathologists and other specialists aren't easily accessible, or in crowded VA emergency rooms, where being able to spot something like a brain hemorrhage in a radiology scan faster could save more lives.

Borkowski said he sees AI as a tool to help doctors work more efficiently, not to put them out of a job.

"It won't replace the doctors, but the doctors who use AI will replace the doctors that don't," he said.

The Tampa pathologists aren't the first to experiment with machine learning in this way. Of the thousands of AI tools out there, the U.S. Food and Drug Administration has approved about 40 algorithms for medicine, including apps that predict blood sugar changes and help detect strokes in CT scans.

The VA already uses AI in several ways such as scanning medical records for signs of suicide risks. Now the agency is looking to expand research into the popular technology.

The department announced the hiring of Gil Alterovitz as its first-ever Artificial Intelligence Director in July 2019 and launched the National Artificial Intelligence Institute in November. Alterovitz is a Harvard Medical School professor who co-wrote an artificial intelligence plan for the White House last year.

He said the VA has a “unique opportunity to help veterans” with artificial intelligence.

As the largest integrated health care system in the country, the VA has vast amounts of patient data, which is helpful when training AI software to recognize patterns and trends. For example, Alterovitz said the health system generates about a billion medical images a year.

He described a potential future where AI could help combine the efforts of various specialists to improve diagnoses.

“So you might have one site where a pathologist is looking at slides, and then a radiologist is analyzing MRI and other scans that look at a different level of the body. With AI, you could have an AI orchestrator putting together different pieces and making potential recommendations that teams of doctors can look at,” he said.

Gil Alterovitz will oversee the VA's efforts to expand AI research and development. CREDIT: ROBERT TURTIL/DEPARTMENT OF VETERANS AFFAIRS

Alterovitz is also looking for other uses to help VA staff make better use of their time and help patients in areas where resources are limited.

“So being able to cut the (clinician) workload down is one way to do that; other ways are working on processes, so reducing patient wait times, analyzing paperwork, etc.,” he said.

The National Artificial Intelligence Institute is still in the planning stage, but Alterovitz said it will have a couple of focus areas.

“One is the flagship projects, or the new initiatives it (the institute) will design and lead, and then many of the projects will be affiliated projects, where we can serve as an affiliation hub for a number of different projects and provide help where needed or connect different pieces together,” he said.

The institute will also seek to collaborate with the private sector through tech sprints, which are time-limited engagements that encourage innovation, as well as educate more VA staff about AI. 

Barriers to AI

But Alterovitz notes there are challenges to implementing AI, like privacy concerns and trying to understand how and why AI systems make decisions.

Last year, Google's AI company DeepMind used VA data to test a system that could predict deadly kidney disease in patients up to two days in advance. But for every correct prediction, there were two false positives.

Many of the false alerts were attributed to reports of past kidney issues in patients’ electronic health records, which wouldn’t necessarily mean they were about to experience sudden kidney failure.
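A ratio of one correct alert to every two false alarms corresponds to a precision of about 33 percent, which a few lines of arithmetic can confirm. The counts below are illustrative, scaled from the reported ratio rather than taken from the study's raw numbers.

```python
# Precision: of all alerts the system raised, what fraction were real?
# Counts are illustrative, scaled from the reported 1:2 ratio.
true_positives = 100   # alerts that preceded actual kidney failure
false_positives = 200  # alerts with no subsequent failure

precision = true_positives / (true_positives + false_positives)
print(f"{precision:.0%}")  # → 33%
```

Put another way, a clinician acting on every alert would be responding to two false alarms for each patient who genuinely needed intervention.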

False positives may cause doctors to run unnecessary tests or take other steps that waste time and erode their confidence in the technology, said Mildred Cho, associate director of the Stanford Center for Biomedical Ethics. She said untrustworthy AI systems could also affect treatment decisions, potentially causing harm.

“It's important for AI systems to be tested in real-world environments with real-world patients and clinicians, because there can be unintended consequences," she said.

The DeepMind study also acknowledged that more than 90 percent of the patients in the dataset used to test the system were male veterans, and that performance was lower for female patients.

Cho said it’s important to test AI systems with a variety of demographics, because what may work for one population may not for another.

Gil Alterovitz said the VA is taking all of those concerns into account as the agency experiments with AI and tries to improve upon the technology to ensure it is reliable and effective. 

VA doctors say they don't want to rush things when it comes to AI. But they say the work they're doing now will sift through the hype and lead to more practical use in the near future.

This story was produced by the American Homefront Project, a public media collaboration that reports on American military life and veterans. Funding comes from the Corporation for Public Broadcasting.