
New AI for mammography scans aims to support human decision-making rather than replace it

Computer engineers and radiologists at Duke University have developed an artificial intelligence platform that analyzes potentially cancerous lesions in mammography scans to determine whether a patient should undergo an invasive biopsy. But unlike many of its predecessors, this algorithm is interpretable: it shows physicians exactly how it arrives at its conclusions.

The researchers trained the AI to locate and evaluate lesions the same way an actual radiologist is trained, rather than letting it freely develop its own procedures. That gives it several advantages over its “black box” counterparts. It could serve as a useful training platform for teaching students how to read mammography images, and it could help physicians in sparsely populated regions of the world who do not read mammography scans regularly make better medical decisions.

The results were published online December 15 in the journal Nature Machine Intelligence.

“If computers are going to help make important medical decisions, physicians need to trust that the AI is basing its conclusions on something that makes sense,” said Joseph Lo, professor of radiology at Duke. “We need algorithms that not only work, but explain themselves and show examples of what they’re basing their conclusions on. That way, whether a physician agrees with the outcome or not, the AI is helping to make better decisions.”

Engineering AI that reads medical images is a huge industry. Thousands of independent algorithms already exist, and the FDA has approved more than 100 of them for clinical use. However, whether they read MRI, CT or mammogram scans, few of them are validated on datasets containing more than 1,000 images, or on datasets that include demographic information. This dearth of information, coupled with the recent failures of several notable examples, has led many physicians to question the use of AI in high-stakes medical decisions.

In one example, an AI model failed even though it was trained on images taken at different facilities using different equipment. Rather than focusing exclusively on the lesions of interest, the AI learned to use subtle differences introduced by the equipment itself to recognize images coming from the cancer ward, and to assign those lesions a higher probability of being cancerous. As one would expect, the AI did not transfer well to other hospitals using different equipment. But because nobody knew what the algorithm was looking at when it made decisions, nobody knew it was destined to fail in real-world applications.

“Our idea was instead to build a system that says this specific part of a potentially cancerous lesion looks a lot like this other one I’ve seen before,” said Alina Barnett, a computer science Ph.D. candidate at Duke and first author of the study. “Without these explicit details, medical practitioners will lose time and faith in the system if there’s no way to understand why it sometimes makes mistakes.”

Cynthia Rudin, professor of electrical and computer engineering and computer science at Duke, compares the new AI platform’s process to that of a real estate appraiser. In the black box models that dominate the field, the appraiser offers a price for the house without any explanation. In a model that includes what is known as a “saliency map,” the appraiser might point out that the house’s roof and backyard were key factors in its pricing, but provides no further details.

“Our method would say that you have a unique copper roof and a backyard pool that are similar to those of other houses in your neighborhood, which made their prices increase by this amount,” Rudin said. “This is what transparency in medical imaging AI could look like, and what those in the medical field should be demanding for any radiology challenge.”
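To make the contrast concrete, here is a minimal, hypothetical sketch of what the three levels of explanation might look like if returned as data structures. None of these field names or values come from the Duke team’s actual software; they simply mirror the appraiser analogy.

```python
# Hypothetical sketch: three styles of model output, from opaque to
# interpretable. Field names and numbers are illustrative only.

# Black box: a score and nothing else.
black_box = {"malignancy_score": 0.87}

# Saliency map: the score plus *where* the model looked,
# but not *why* those regions mattered.
saliency = {
    "malignancy_score": 0.87,
    "heatmap": "pixel-level importance weights over the scan",
}

# Case-based explanation: the score plus the specific past cases the
# lesion resembles and how much each resemblance contributed.
case_based = {
    "malignancy_score": 0.87,
    "evidence": [
        {"prototype": "spiculated margin, biopsy-confirmed malignant",
         "similarity": 0.91, "contribution": +0.34},
        {"prototype": "circumscribed margin, benign outcome",
         "similarity": 0.22, "contribution": -0.05},
    ],
}
```

Only the last form lets a reader trace every point of evidence back to a concrete prior case, which is the kind of transparency Rudin describes.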

Researchers trained the new AI using 1,136 images taken from 484 patients at Duke University Health System.

They first taught the AI to find the suspicious lesion in question and to ignore all of the healthy tissue and other irrelevant data. Then they had radiologists carefully label the images, teaching the AI to focus on the edges of the lesions, where potential tumors meet healthy surrounding tissue, and to compare those edges to edges in images with known cancerous and benign outcomes.
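In spirit, this resembles case-based classifiers such as prototype networks, where a new lesion is scored by its similarity to stored example cases. The sketch below is a simplified illustration under assumed names, not the paper’s actual implementation; the feature vectors, prototype set and weights are all placeholders.

```python
import numpy as np

def predict_with_prototypes(margin_features: np.ndarray,
                            prototypes: np.ndarray,
                            weights: np.ndarray) -> tuple[float, np.ndarray]:
    """Score a lesion by comparing its margin features against stored
    prototype cases with known benign/malignant outcomes.

    margin_features: feature vector from the lesion's margin (n_features,)
    prototypes:      (n_prototypes, n_features) learned example cases
    weights:         per-prototype weights tying similarity to malignancy
    """
    # Distance from this lesion to each previously seen prototype case.
    distances = np.linalg.norm(prototypes - margin_features, axis=1)

    # Map distance to a similarity score (closer cases score higher).
    similarities = np.log((distances**2 + 1.0) / (distances**2 + 1e-4))

    # The final score is a transparent weighted sum over named cases,
    # so every piece of evidence traces back to a specific prototype.
    score = float(similarities @ weights)
    return score, similarities

# Toy usage with made-up numbers:
protos = np.array([[0.9, 0.1], [0.1, 0.8]])  # two stored prototype cases
w = np.array([0.7, -0.3])                    # malignant vs. benign weights
score, sims = predict_with_prototypes(np.array([0.85, 0.2]), protos, w)
```

Because the score decomposes into per-prototype contributions, a radiologist can inspect which past case drove any given decision.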

Radiating lines or fuzzy edges, known medically as mass margins, are the best predictors of cancerous breast tumors and the first thing radiologists look for. That is because cancerous cells replicate and expand so quickly that not all of a developing tumor’s edges are easy to see in a mammogram.

“This is a unique way to train an AI to look at medical imagery,” Barnett said. “Other AIs are not trying to imitate radiologists; they come up with their own methods for answering the question, which often rely on unhelpful or even flawed reasoning processes.”

After training was complete, the researchers put the AI to the test. While it did not outperform human radiologists, it did just as well as other black box computer models. And when the new AI is wrong, the people working with it can recognize that it is wrong and see why it made the mistake.

Going forward, the team is working to add other physical characteristics for the AI to consider when making its decisions, such as a lesion’s shape, which is the second feature radiologists learn to look at. Rudin and Lo also recently received a Duke MEDx High-Risk High-Impact Award to continue developing the algorithm and to conduct a reader study with radiologists to see whether it improves clinical performance and trust.

“There was a lot of excitement when researchers first started applying AI to medical images, that maybe a computer would be able to see something or figure something out that people couldn’t,” said Fides Schwartz, a researcher in Duke’s Department of Radiology. “In some rare instances that might be the case, but it probably isn’t in the majority of scenarios. So we are better served making sure that we, as humans, understand what information the computer uses to base its decisions on.”

This research was supported by the National Institutes of Health/National Cancer Institute (U01-CA214183, U2C-CA233254), MIT Lincoln Laboratory, Duke TRIPODS (CCF-1934964) and the Duke Incubation Fund.
