From Content-Aware Denoising to Semantic Unmixing of Microscopy Data

Abstract number
83
Presentation Form
Invited
Corresponding Email
[email protected]
Session
Session 3 - The AI Revolution: from Image Analysis, to Intelligent Acquisition, to Chatbots Designing your Experiments and Writing your Papers
Authors
Florian Jug (1)
Affiliations
1. Fondazione Human Technopole
Keywords

Denoising, Unmixing, Deep Learning, Microscopy, FAIR

Abstract text

The necessity to analyze scientific images is as old as the ability to acquire such data. While this analysis initially happened by visual observation alone, modern microscopy techniques now enable us to image at unprecedented spatial and temporal resolutions, through the 'eyes' of many diverse imaging modalities.

The vast amounts of data acquired in the context of life science research can no longer be analyzed by manual observation alone. Instead, algorithmic solutions help researchers study and quantify scientific image data.

In recent years, the use of artificial intelligence (AI) for the automated analysis of scientific image data has gained significant traction, and many important analysis problems now have much-improved solutions based on artificial neural networks (ANNs). At the same time, we are becoming aware of the limitations that come with this new set of machine learning approaches. Overcoming them will require a community effort, and I will be happy to make this case in my talk.

Additionally, I would like to give an update on some of our latest algorithmic developments, namely the semantic unmixing of superimposed structures in fluorescence microscopy data. Finally, I will also talk about AI4Life, a FAIR and easy-to-use infrastructure to store, share, and run AI-based methods hosted on bioimage.io (the BioImage Model Zoo).