Imaging techniques enable a detailed look inside an organism. But interpreting the data is time-consuming and requires a great deal of experience. Artificial neural networks open up new possibilities: they require just seconds to interpret whole-body scans of mice and to segment and depict the organs in colors instead of in various shades of gray. This facilitates the analysis considerably. An interdisciplinary research team has now developed self-learning algorithms that will in future help analyze bioscientific image data.
In the AIMOS project, the algorithms were trained with the help of images of mice. At the core of the AIMOS software – the abbreviation stands for AI-based Mouse Organ Segmentation – are artificial neural networks that, like the human brain, are capable of learning. The research was conducted at TranslaTUM, the Center for Translational Cancer Research at TUM. The institute is part of the TUM University Hospital rechts der Isar and specializes in transferring cancer research insights to practical patient services through interdisciplinary cooperation. For the novel 3D microscopy, the scientists at TUM worked closely with experts at the Helmholtz Zentrum München.
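The segmentation step described above – turning gray-value whole-body scans into color-coded organ maps – can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the actual AIMOS code: the trained network is replaced here by a hypothetical stand-in that produces random per-voxel class scores, and the label-to-color palette is invented for the example.

```python
import numpy as np

def fake_network_scores(volume, n_classes):
    # Hypothetical stand-in for the trained segmentation network:
    # in the real pipeline a neural network produces per-voxel
    # class scores for the scan; here we fake them with random numbers.
    rng = np.random.default_rng(0)
    return rng.random(volume.shape + (n_classes,))

def segment_and_colorize(volume, n_classes, palette):
    """Assign each voxel its most likely organ label, then map the
    labels to RGB colors so organs stand out from the gray scan."""
    scores = fake_network_scores(volume, n_classes)
    labels = scores.argmax(axis=-1)   # per-voxel organ label
    colored = palette[labels]         # same volume shape plus an RGB axis
    return labels, colored

# Invented palette: label 0 = background, 1 and 2 = two example organs.
palette = np.array([[0, 0, 0], [255, 0, 0], [0, 128, 0]], dtype=np.uint8)
scan = np.zeros((4, 64, 64))          # tiny dummy grayscale volume
labels, colored = segment_and_colorize(scan, n_classes=3, palette=palette)
```

Once trained, such a network can process a full scan in a single forward pass, which is why the interpretation takes seconds rather than the hours a manual, slice-by-slice annotation would require.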
The project was headed by Prof. Bjoern Menze (Chair for Computer Aided Medical Procedures & Augmented Reality), leader of the Image-Based Biomedical Modeling group at TranslaTUM at TUM.