Scientists at Skoltech, Philips Research and Goethe University Frankfurt have trained a neural network to detect abnormalities in medical images, helping doctors sift through the countless scans they examine in search of pathology. Reported in IEEE Access, the new method is adapted to the nature of medical imaging and is more effective at spotting anomalies than general-purpose solutions.
Identifying image anomalies is a task that comes up in data analysis across many industries, but medical scans pose a special challenge. It is far easier for an algorithm to find a car with a flat tire or broken glass in a series of car pictures than to tell which X-ray shows early signs of lung pathology, such as the onset of COVID-19 pneumonia.
Skoltech Professor Dmitry Dylov, head of the institute's computational imaging group and senior author of the study, explains that medical imaging is difficult for a number of reasons. "For one thing, the anomalies look very similar to the normal case. Cells are cells, and you usually need a trained professional to recognize that something is wrong."
"Besides, there is a dearth of anomaly examples for neural network training," the researcher added. "Machines are good at two-class problems, where you have two well-defined classes, each with a lot of training examples, like cats and dogs. With medical scans, the normal condition is always heavily overrepresented, and the abnormalities do not form a well-defined class of their own."
To confirm that the method generalizes across imaging techniques, Dylov's team tested it on four datasets of chest X-rays and four datasets of histological microscopy images of breast cancer. While the margin of improvement varied widely depending on the dataset, the new method consistently outperformed conventional solutions in every case considered. What distinguishes the new method from its competitors is that it attempts to "grasp" the general impression an expert working with the scans would form, relying on the features that inform human judgments.
The study also proposes a recipe for standardizing approaches to the problem of anomaly detection in medical imaging, so that different research groups can compare their models in a consistent and reproducible way.
"We propose using what is known as weakly supervised training," Dylov said. "Since two clearly defined classes are not available, this task is usually treated in an unsupervised setting, meaning the anomalies in the training data are not annotated. But doctors can always point to a few unusual examples, so we showed a handful of abnormal images to the network, and that unlocked the arsenal of weakly supervised methods. Labeling even a couple of anomalous scans out of every 200 goes a long way, and that is a realistic requirement."
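The weak-supervision idea described above can be illustrated with a toy sketch. This is not the authors' actual model; it is a hypothetical example assuming simple feature vectors, in which a "normality" model is fitted to unlabeled (mostly normal) data and a small handful of labeled anomalies is used only to set the decision threshold — the weak supervision signal.

```python
# Toy sketch of weakly supervised anomaly detection (hypothetical, not the
# published method). A Gaussian normality model is fitted to unlabeled data
# assumed to be mostly normal; five labeled anomalies are used only to pick
# the decision threshold.
import numpy as np

rng = np.random.default_rng(0)

# Unlabeled training set: 200 feature vectors, overwhelmingly normal.
normal = rng.normal(0.0, 1.0, size=(200, 4))
# Weak supervision: just 5 known anomalous examples from a shifted distribution.
labeled_anomalies = rng.normal(4.0, 1.0, size=(5, 4))

# Fit a per-feature Gaussian to the (mostly normal) unlabeled data.
mu = normal.mean(axis=0)
sigma = normal.std(axis=0) + 1e-8

def anomaly_score(x):
    """Mean squared z-score distance from the fitted normal population."""
    return np.mean(((x - mu) / sigma) ** 2, axis=-1)

# Place the threshold between the highest normal score and the lowest
# score among the few labeled anomalies.
threshold = 0.5 * (anomaly_score(normal).max()
                   + anomaly_score(labeled_anomalies).min())

def is_anomalous(x):
    return anomaly_score(x) > threshold
```

A preliminary screening pass like `is_anomalous(scan_features)` could then flag ambiguous cases for a human expert, which is the workflow the article describes.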
According to the authors, it should be easy to extend their approach to a wider range of medical scans beyond the two types used in the study, because the solution is adapted to the general nature of such images: it is sensitive to small-scale anomalies and makes good use of the few available examples of them during training.
Irina Fedulova, co-author of the study and head of the Philips Research branch in Moscow, commented: "We are pleased that the Philips-Skoltech partnership is able to address challenges that are highly relevant to the healthcare industry. This solution could significantly speed up the work of histologists, radiologists and other medical professionals who face the tedious task of spotting minute abnormalities in large numbers of scans. A preliminary screening pass can weed out clearly problem-free images, leaving only the more ambiguous cases for the human expert."
Nina Shvetsova et al., IEEE Access (2021). DOI: 10.1109/ACCESS.2021.3107163
Citation: Artificial intelligence detects anomalies in medical images (October 21, 2021), retrieved October 22, 2021 from https://techxplore.com/news/2021-10-artificial-intelligence-anomalies-medical-images.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.