

Recent advances in computing power have triggered the use of artificial intelligence for image analysis in the life sciences. Training these algorithms requires a sufficiently large set of certified labeled data. The trained neural network can then produce accurate instance segmentation results, which must be re-assembled into the original dataset; the entire process demands substantial expertise and time to achieve quantifiable results. To speed up the process, from cell organelle detection to quantification across electron microscopy modalities, we propose a deep-learning-based approach for fast automatic outline segmentation (FAMOUS) that combines organelle detection with image morphology and 3D meshing to automatically segment, visualize, and quantify cell organelles within volume electron microscopy datasets. From start to finish, FAMOUS delivers full segmentation results within a week on previously unseen datasets. FAMOUS was showcased on a HeLa cell dataset acquired with a focused ion beam scanning electron microscope and on yeast cells acquired by transmission electron tomography.

Research Highlights:
- Introducing a rapid, multimodal machine-learning workflow for the automatic segmentation of 3D cell organelles.
- Successfully applied to a variety of volume electron microscopy datasets and cell lines.
- Outperforming manual segmentation methods in time and accuracy.
- Enabling high-throughput quantitative cell biology.

© 2024 The Authors. Microscopy Research and Technique published by Wiley Periodicals LLC.
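The abstract outlines a pipeline of organelle detection followed by morphological cleanup and 3D quantification. As a minimal illustrative sketch only (the actual FAMOUS network, morphology parameters, and meshing step are not specified in this abstract, and `postprocess_and_quantify`, its voxel size, and its size threshold are hypothetical), such a post-processing stage might look like this:

```python
# Illustrative sketch only: NOT the published FAMOUS implementation.
# Assumes a binary prediction mask from some detection network; the
# function name, voxel size, and minimum-size threshold are invented
# here for demonstration.
import numpy as np
from scipy import ndimage

def postprocess_and_quantify(pred_mask, voxel_size_nm=8.0, min_voxels=10):
    """Clean a binary network prediction with 3D morphology, then
    return the volume (in nm^3) of each surviving organelle."""
    closed = ndimage.binary_closing(pred_mask)           # fill small gaps
    labels, n = ndimage.label(closed)                    # connected components
    sizes = ndimage.sum(closed, labels, range(1, n + 1)) # voxel count per label
    return [s * voxel_size_nm**3 for s in sizes if s >= min_voxels]

# toy volume: one 4x4x4 "organelle" plus a single noise voxel
vol = np.zeros((16, 16, 16), dtype=bool)
vol[4:8, 4:8, 4:8] = True
vol[12, 12, 12] = True
print(postprocess_and_quantify(vol))  # only the large component survives
```

In a full 3D workflow the cleaned label volume would then feed a meshing step (e.g. marching cubes) for visualization; here only the quantification half is sketched.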
| Engineering controlled terms: | Cell culture; Cells; Deep learning; Electric impedance tomography; Electron microscopes; Electrons; High resolution transmission electron microscopy; Image segmentation; Ion beams; Medical imaging; Scanning electron microscopy; Strontium titanates |
|---|---|
| Engineering uncontrolled terms: | Automated segmentation; Cell biology; Cell organelle; Computing power; Image-analysis; Life-sciences; Neural-networks; Segmentation results; Volume electron microscopy; Volume electrons |
| Engineering main heading: | Image analysis |
| EMTREE medical terms: | algorithm; artificial neural network; cell organelle; cytology; deep learning; electron microscopy; HeLa cell line; human; image processing; procedures; Saccharomyces cerevisiae; three-dimensional imaging; ultrastructure; volume electron microscopy |
| MeSH: | Algorithms; Deep Learning; HeLa Cells; Humans; Image Processing, Computer-Assisted; Imaging, Three-Dimensional; Microscopy, Electron; Neural Networks, Computer; Organelles; Saccharomyces cerevisiae; Volume Electron Microscopy |
| Funding sponsor | Funding number | Acronym |
|---|---|---|
| Medical Research Council | | MRC |
| Francis Crick Institute | | FCI |
| Wellcome Trust | | WT |
| European Cooperation in Science and Technology | CA17121 | COST |
| Cancer Research UK | CC1076 | CRUK |
| Knut och Alice Wallenbergs Stiftelse | 2017.0091 | |
| Vetenskapsrådet | 2019‐04004 | VR |
This article is based upon work from COST Action CA17121, supported by COST (European Cooperation in Science and Technology): www.comulis.eu (Walter, Kleywegt, & Verkade, 2021). We thank the Electron Microscopy STP at the Francis Crick Institute. The work of Christopher J. Peddie was supported by the Francis Crick Institute, which receives its core funding from Cancer Research UK (CC1076), the UK Medical Research Council (CC1076), and the Wellcome Trust (CC1076). This work was supported by a grant from Knut och Alice Wallenbergs Stiftelse (2017.0091) and Swedish Research Council grant 2019‐04004 to Johanna L. Höög. Open Access funding enabled and organized by Projekt DEAL.
Stojmenović, M.; Department of Computer Science and Electrical Engineering, Singidunum University, Belgrade, Serbia;
Walter, A.; Aalen University of Applied Sciences, Aalen, Germany;
© Copyright 2024 Elsevier B.V., All rights reserved.