CelFDrive: AI-assisted microscopy for automated detection of rare events

Abstract number
155
Presentation Form
Oral
DOI
10.22443/rms.elmi2024.155
Corresponding Email
[email protected]
Session
Session 3 - The AI Revolution: from Image Analysis, to Intelligent Acquisition, to Chatbots Designing your Experiments and Writing your Papers
Authors
Scott Brooks (2), Sara Toral-Perez (2), David Corcoran (2), Karl Kilburn (1), Hella Broughton (1), Nigel Burroughs (2), Andrew McAinsh (2), Till Bretschneider (2)
Affiliations
1. Intelligent Imaging Innovations (3i)
2. University of Warwick
Keywords

AI, Automated microscopy, 3D light-sheet microscopy, Cell division 

Abstract text

Automated microscopy is an emerging field facilitated by developments in both microscopy and deep learning. Lattice light-sheet microscopy (LLSM) [1] is a powerful imaging method that offers high spatial and temporal resolution whilst minimising photobleaching. Advances in deep learning and computer vision have produced networks that can classify and localise objects in close to real time [2].

We present CelFDrive, a semi-automated pipeline that improves the speed of image acquisition on the LLSM. A much larger field of view is obtained with an inverted objective and analysed by a YOLOv8 [3] network to classify cell state, enabling identification of cells in the field of view that are about to enter a (rare) state of interest. The cell of interest is centred automatically on the stage and pre-set LLSM image acquisition parameters are imported. From here the user can confirm whether they wish to begin imaging the cell of interest.
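
A minimal sketch of this detection step, using the Ultralytics YOLOv8 Python API; the weights filename, image filename, class index and confidence threshold are all illustrative, and the real pipeline is integrated with the microscope control software rather than run standalone:

    from ultralytics import YOLO

    # Load a trained YOLOv8 detection model (the weights filename is hypothetical).
    model = YOLO("celfdrive_yolov8.pt")

    # Run inference on a low-magnification overview frame.
    results = model("overview_frame.png")

    # Report detections of the class of interest, e.g. cells about to enter
    # mitosis (class index and threshold are assumptions for illustration).
    TARGET_CLS, CONF_THRESH = 0, 0.5
    for box in results[0].boxes:
        if int(box.cls) == TARGET_CLS and float(box.conf) >= CONF_THRESH:
            x, y, w, h = box.xywh[0].tolist()
            print(f"candidate cell at pixel ({x:.0f}, {y:.0f}), "
                  f"confidence {float(box.conf):.2f}")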

We have applied this method to predict which RPE-1 human retinal cells are about to enter mitosis, enabling prophase and prometaphase to be imaged in lattice mode. The cells were stained with the SiR-DNA probe to label chromosomes, which are visualised by fluorescence with the inverted objective. For each classified cell, its location within the image is converted to a stage position. After centring, 4D time-lapses are acquired with the LLSM. The software continuously rasters through the coverslip until a cell matching the specified criteria is found. Cells of interest are found far faster than a human could locate them with the LLSM, allowing higher-throughput imaging sessions that capture rare events.
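
The conversion from image coordinates to a stage position, and the raster search, could look like the sketch below. It assumes the stage position corresponds to the image centre and that camera and stage axes are aligned (a real system would apply a calibrated affine transform); move_stage() and snap_image() are hypothetical stand-ins for microscope control calls, and the pixel size is illustrative:

    def pixel_to_stage(x_px, y_px, stage_x, stage_y, um_per_px, img_w, img_h):
        # Offset of the detection from the image centre, scaled to microns.
        dx = (x_px - img_w / 2) * um_per_px
        dy = (y_px - img_h / 2) * um_per_px
        return stage_x + dx, stage_y + dy

    def raster_until_found(model, positions, target_cls=0, conf_thresh=0.5):
        # Visit coverslip positions in turn; stop at the first detection of
        # the target class above the confidence threshold.
        for sx, sy in positions:
            move_stage(sx, sy)       # hypothetical stage-control call
            frame = snap_image()     # hypothetical camera call (numpy array)
            result = model(frame)[0]
            for box in result.boxes:
                if int(box.cls) == target_cls and float(box.conf) >= conf_thresh:
                    x, y, _, _ = box.xywh[0].tolist()
                    h, w = frame.shape[:2]
                    return pixel_to_stage(x, y, sx, sy, um_per_px=0.65,
                                          img_w=w, img_h=h)
        return None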

To train the network, we provide a graphical user interface for rapidly annotating instances of rare events within a time sequence. CellClicker takes a user-drawn bounding box around an image at the end of a biological process and opens the region in a new interface. The user then clicks on the centre of the region of interest as it moves back through the time-lapse, rather than drawing boxes for each instance. CellSelector then displays all extracted series and, for each one, lets the user select the timepoint within the series at which they would like to start imaging, such as prometaphase. CellSelector can hold multiple users' responses, which can be aggregated to improve confidence in the human-annotated data. These interfaces make it easy to retrain CelFDrive and apply it to other biological processes.
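
A sketch of how such click series could be turned into YOLO-format training labels (one normalised "class x_center y_center width height" line per frame, each written to that frame's label file). Carrying the size of the user's original bounding box back through time is an assumption made here for illustration:

    def clicks_to_yolo_labels(clicks, box_w, box_h, img_w, img_h, cls_id):
        # One label line per clicked frame, normalised to [0, 1] as YOLO expects.
        return [f"{cls_id} {x / img_w:.6f} {y / img_h:.6f} "
                f"{box_w / img_w:.6f} {box_h / img_h:.6f}"
                for x, y in clicks]

    # Example: a cell clicked across three frames of a 2048 x 2048 overview.
    labels = clicks_to_yolo_labels(clicks=[(1032, 887), (1029, 890), (1025, 896)],
                                   box_w=120, box_h=120,
                                   img_w=2048, img_h=2048, cls_id=0)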

This project is in collaboration with Intelligent Imaging Innovations (3i), a manufacturer of microscopes with integrated software control through SlideBook™. CelFDrive can be deployed to any 3i LLSM and can also be used on other 3i microscope systems. We have developed a fully automated prototype on a 3i Marianas SDC (spinning disk confocal) which, in place of switching to the LLSM view, switches from 40x to 100x magnification and performs centring during acquisition to ensure the sample stays within the field of view. In the near future, we aim to give users a fully automated option for capturing cells with the LLSM as well.
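
One step of such closed-loop centring during acquisition could be sketched as follows; the tolerance, pixel size and stage-update convention are placeholders, and the actual prototype drives the microscope through 3i's control software rather than code like this:

    def recentre_step(model, frame, stage_x, stage_y, um_per_px,
                      target_cls=0, tol_um=5.0):
        # Detect the tracked cell; if it has drifted more than tol_um from
        # the image centre, return an updated stage position, else hold.
        result = model(frame)[0]
        best = max((b for b in result.boxes if int(b.cls) == target_cls),
                   key=lambda b: float(b.conf), default=None)
        if best is None:
            return stage_x, stage_y              # nothing detected; hold
        x, y, _, _ = best.xywh[0].tolist()
        h, w = frame.shape[:2]
        dx = (x - w / 2) * um_per_px
        dy = (y - h / 2) * um_per_px
        if (dx * dx + dy * dy) ** 0.5 > tol_um:
            return stage_x + dx, stage_y + dy    # re-centre on the cell
        return stage_x, stage_y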


Figure: A) A bounding box is drawn by the user and the CellClicker UI opens with the region of interest (ROI). The user then clicks on the centre of the cell to keep it in view as it moves back through time. CellSelector takes all mini-series generated by CellClicker and allows the user to pick which state they would like to classify. The labels generated are used to train a YOLOv8 model. B) CelFDrive: RPE-1 cells expressing NDC80-EGFP were stained with SiR-DNA to image cell-cycle stages (B1); this image is input to the trained YOLOv8 model. The classified cell is then centred on the stage for imaging (B2). C) The prometaphase cell is imaged using the LLSM to capture high spatiotemporal resolution time-lapses.

References

1. Chen, B.-C., Legant, W. R., Wang, K., Shao, L., Milkie, D. E., Davidson, M. W., ... & Betzig, E. (2014). Lattice light-sheet microscopy: Imaging molecules to embryos at high spatiotemporal resolution. Science, 346(6208), 1257998. https://doi.org/10.1126/science.1257998

2. Shi, Y., Tabet, J.S., Milkie, D.E., et al. (2024). Smart lattice light-sheet microscopy for imaging rare and complex cellular events. Nature Methods, 21, 301–310. https://doi.org/10.1038/s41592-023-02126-0

3. Jocher, G., Chaurasia, A., & Qiu, J. (2023). Ultralytics YOLO (Version 8.0.0) [Computer software]. https://github.com/ultralytics/ultralytics