Acquiring images with modern techniques such as light sheet fluorescence, confocal, or electron microscopy produces a substantial data stream. Add modalities such as multichannel, 3-D, and time-lapse imaging, and managing the resulting data sets quickly becomes a serious issue. Researchers therefore need more efficient solutions for data storage and processing, in terms of both hardware and software. In addition, adapting workflows to these massive data requirements calls for substantial changes to training and education.

In this webinar, the panelists will discuss how these challenges can be addressed, both conceptually and in day-to-day research in the lab. They will outline systematic approaches showing how microscope users can obtain greater benefit and more consistent results from big-image-data experiments, and will present examples of successful workflows. The webinar will be of interest to scientists in any research area that uses microscopy or analyzes big image data to extract information from multidimensional experiments. These areas include developmental research, neurobiology and cell biology, high-content screening, medical imaging, and materials science, among others.
During this webinar, the panelists will answer the live audience’s questions and discuss:
- Image data heterogeneity and how to maintain compatibility between microscopy modalities (metadata, file formats, and open data interfaces)
- Image processing based on computer clusters and worldwide networks
- Strategies for automated data handling, processing, and storage workflows in a microscopy facility environment (for example, for automated whole slide imaging)
The webinar will last approximately 60 minutes.
To learn more about products or technologies related to this webinar, go to: www.zeiss.com/big-image-data