Case Study

How Docker Accelerates ZEISS Microscopy’s AI Journey

About the company: ZEISS is an internationally leading technology enterprise
Industry: Optics and optoelectronics
Company size: 42,000+ employees
Location: Headquartered in Oberkochen, Germany; represented in around 50 countries worldwide
Product: Docker Desktop

Highlights

Docker’s container solution was a game changer

The ZEISS Research Microscopy Solutions (RMS) team faced challenges in its artificial intelligence (AI) and machine learning (ML) journey. The team crafted a strategy centered on the powerful combination of cloud platforms and Docker’s container technology, marking the beginning of a transformative era. Working with Docker, the ZEISS team took advantage of rapid deployment and scaling, GPU accessibility, cross-platform portability, resource efficiency, version control and reusability, isolation and security, developer familiarity with Docker solutions, and the vast Docker ecosystem.

Introduction

International optics leader leverages containerized AI

For more than 175 years, ZEISS has been shaping technological progress, advancing the world of optics with solutions from its four segments — Semiconductor Manufacturing Technology, Industrial Quality & Research, Medical Technology, and Consumer Markets — and meeting its customers’ needs.

The ZEISS Research Microscopy Solutions (RMS) team uses containers to deploy AI models and code across various platforms. For example, ZEISS leverages its cloud-based arivis AI platform to annotate multi-dimensional microscopy image data and train AI models for segmentation. Those models — including all the code required to run them — are deployed via containers with application-specific interfaces for different clients.

Company profile

About ZEISS

ZEISS is an internationally leading technology enterprise in the optics and optoelectronics industries. The ZEISS Group has more than 42,000 employees across six continents and generates more than €10 billion in annual revenue.

Challenges

The ZEISS Research Microscopy Solutions (RMS) team's need for computer vision across platforms

As a leading manufacturer of microscopes, ZEISS offers solutions and services for life sciences and materials research, teaching, and clinical routine. Reliable ZEISS systems are used worldwide for manufacturing and assembly in high-tech industries, for the exploration and processing of raw materials, and for life science and basic research. Because modern microscopy is all about “actionable information” extracted from large, multi-dimensional datasets, powerful computer vision and processing methods are an integral component of the RMS software portfolio. In particular, the need for robust and powerful segmentation methods using state-of-the-art AI models is a common use case.

The R&D team at the Product Center for Software inside RMS faced challenges in its artificial intelligence (AI) and machine learning (ML) journey. They needed to execute AI models trained on their cloud platform in local Windows-based clients such as ZEN, ZEN core, and arivis Pro, with GPU support. They wanted an efficient way to distribute these AI models and update them independently of the client code. The AI models also needed to deliver consistent results across different platforms, regardless of where they ran.

The RMS product teams also wanted to shield clients from the complexities of the model internals. These application-specific internals included various ML technologies, data handling methods, pre- and post-processing routines, and APIs. Their primary goal was straightforward: run AI models and code consistently and distribute code with diverse dependencies smoothly.

Integrating AI functionality into the ZEISS team’s client systems presented a challenge. To achieve the same results when running AI models on a client, the local environment needs to be identical to the environment used during training on the cloud platform. In other words, the challenge is to keep both worlds in sync. Containers are the solution to this challenge.

Additionally, the containerized AI algorithms require access to GPU resources from a Linux-based container running on a Windows host system, which added another layer of complexity.
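
For illustration only, the minimal sketch below (not ZEISS code) shows how a Linux container started from a Windows host can be given GPU access through Docker Desktop’s WSL 2 backend. It uses the Docker SDK for Python with a GPU device request; the CUDA base image tag is an assumption, and an NVIDIA driver with WSL support must be installed on the host.

```python
# Illustrative sketch: run nvidia-smi inside a Linux container with GPU access.
# Assumes Docker Desktop (WSL 2 backend) and an NVIDIA driver with WSL support.
import docker

client = docker.from_env()

output = client.containers.run(
    "nvidia/cuda:12.2.0-base-ubuntu22.04",  # stock CUDA base image (assumed tag)
    "nvidia-smi",                           # prints the GPUs visible in the container
    device_requests=[docker.types.DeviceRequest(count=-1, capabilities=[["gpu"]])],
    remove=True,                            # clean up the container afterward
)
print(output.decode())
```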

Solution

Docker helps put a lens on ZEISS AI solutions

Containers have become a mature and standard deployment technology. Always at innovation’s cutting edge, the ZEISS Research Microscopy Solutions team saw the clear benefits of containers: they deploy seamlessly in any environment, are simple to manage, scale, and patch, and are compatible with a wide range of tools and frameworks, making them the preferred technology for the team’s machine-learning distribution needs. The team’s hands-on work with their training data platform’s cloud modules, which use containers, gave them a deep understanding of the technology’s potential.

Facing these challenges, the ZEISS team crafted a strategy centered on the powerful combination of cloud platforms and container technology. They centralized annotations and training on their cloud platform. The ZEISS team used AzureML to execute and oversee this training, simplifying the training process and automatically creating and registering the AiModelContainer. They designed this container to house the model, all vital libraries, and the necessary code for data I/O, pre- and post-processing, tiling, and model inference.
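
As a rough, hypothetical illustration of the kind of code such a container bundles, the sketch below shows a minimal tile-based segmentation loop in Python. The tile size, normalization, and thresholding “model” are placeholders standing in for the real trained model and processing routines.

```python
# Illustrative sketch only: placeholder pre-processing, tiling, "model", and
# post-processing of the kind an AiModelContainer might bundle (not ZEISS code).
import numpy as np

TILE = 256  # hypothetical tile edge length in pixels


def preprocess(image: np.ndarray) -> np.ndarray:
    # Placeholder normalization to the [0, 1] range
    return image.astype(np.float32) / max(float(image.max()), 1.0)


def model_infer(tile: np.ndarray) -> np.ndarray:
    # Stand-in for the trained segmentation model: simple thresholding
    return (tile > 0.5).astype(np.uint8)


def postprocess(mask: np.ndarray) -> np.ndarray:
    # Placeholder for label cleanup, size filtering, etc.
    return mask


def segment(image: np.ndarray) -> np.ndarray:
    # Tile the image, run inference per tile, and stitch the results back together
    image = preprocess(image)
    mask = np.zeros(image.shape, dtype=np.uint8)
    for y in range(0, image.shape[0], TILE):
        for x in range(0, image.shape[1], TILE):
            mask[y:y + TILE, x:x + TILE] = model_infer(image[y:y + TILE, x:x + TILE])
    return postprocess(mask)


if __name__ == "__main__":
    print(segment(np.random.rand(512, 512)).shape)  # -> (512, 512)
```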

The team set up a secure connection between the cloud and local environments. This connection lets the client download the required container image directly to the local system, preserving the data’s integrity and security. ZEISS entrusted the ContainerApps library with managing these container images. This library took on tasks from starting and stopping containers to mounting the GPU, ensuring peak performance.
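
The sketch below, written with the Docker SDK for Python, illustrates the kind of lifecycle a library such as ContainerApps handles: pulling the model image, starting it with a published port and a GPU device request, and stopping and removing it afterward. The registry address, image name, tag, and port are hypothetical.

```python
# Illustrative sketch of container lifecycle management; the registry address,
# image name, tag, and port are hypothetical, not the actual ZEISS setup.
import docker

REPOSITORY = "registry.example.com/rms/ai-model-container"  # hypothetical registry
TAG = "1.2.0"

client = docker.from_env()  # talks to the local Docker engine (Docker Desktop)

# Pull the model image from the private registry the client is connected to
client.images.pull(REPOSITORY, tag=TAG)

# Start the container: publish its REST port and request GPU access
container = client.containers.run(
    f"{REPOSITORY}:{TAG}",
    detach=True,
    ports={"8080/tcp": 8080},
    device_requests=[docker.types.DeviceRequest(count=-1, capabilities=[["gpu"]])],
)

# ... the client application now talks to the model over its REST interface ...

container.stop()
container.remove()
```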

Communication between the AiModelContainer and local clients became the solution’s cornerstone. The ZEISS team rolled out a well-structured, versioned REST API, guaranteeing smooth, efficient, and error-free communication. This focused strategy tackled the immediate challenges and set the stage for scalable and lasting AI integrations. Using containers with a well-defined interface decouples the development of new AI methods from the release cycle of the main software platform (separation of concerns). The AI code can evolve without impacting existing client applications, thereby reducing cross-dependencies between components.
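
As an illustration of what calling such an interface could look like from a client, the sketch below posts an image to a hypothetical versioned endpoint and saves the returned mask. The endpoint paths, port, and payload format are assumptions, not the actual ZEISS API contract.

```python
# Illustrative sketch of a client calling a versioned REST interface exposed by
# the model container (endpoint paths, port, and payload shape are hypothetical).
import requests

BASE_URL = "http://localhost:8080/v1"  # assumed versioned API root

# Ask the container which model and interface version it implements
info = requests.get(f"{BASE_URL}/info", timeout=10).json()
print("model:", info.get("model_name"), "api:", info.get("api_version"))

# Submit an image for segmentation and store the returned mask
with open("cells.tif", "rb") as f:
    response = requests.post(
        f"{BASE_URL}/segment",
        files={"image": ("cells.tif", f, "image/tiff")},
        timeout=300,
    )
response.raise_for_status()

with open("cells_mask.tif", "wb") as out:
    out.write(response.content)
```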

Key benefits

Speed, security, and choice

Working with Docker yielded numerous benefits for ZEISS:

Rapid deployment and scaling

The ability to create containers swiftly aids in on-demand scaling and supports CI/CD practices.

GPU accessibility

The unique ability to access the GPU from Linux-based containers, even when running on a Windows host, enhances computational capabilities.

Cross-platform portability

Docker Desktop lets users run applications on Windows hosts and supports both Linux- and Windows-based containers.

Resource efficiency

Reduced overhead from the shared operating system kernel leads to efficient resource utilization.

Version control and reusability

Docker's versioning and image reusability ensure consistent deployments across environments.

Isolation and security

The encapsulation provided by Docker containers ensures application security and isolation from external interference.

Ecosystem and community support

The vast Docker ecosystem and active community provide tools, improvements, and knowledge support.

Familiarity and reduced learning curve

The ZEISS team's prior experience with Docker, the de facto industry standard for containerization, ensured a smoother integration process.

Reproducibility

The capability to run a model using the same code in the same environment ensures reproducible and consistent results across platforms.

Together, these factors solidified Docker as the optimal solution for the ZEISS team, empowering them to amplify performance, guarantee reproducibility, and streamline their software architecture while working securely.

Results

Docker helps ZEISS see a clear solution in containerization for AI/ML

The ZEISS Research Microscopy Solutions team adopted Docker’s solution, marking the beginning of a transformative era. This adoption improved consistency in AI model outcomes and streamlined the deployment process, harmonizing the relationship between models and their dependencies. Using the “containers and code” approach for deployment lets the team get the same results in all client applications because they all use identical environments and tools to run the models inside the containers. This greatly reduced the need for code duplication, letting the development teams focus on creating new features instead of keeping code in sync across platforms. The ZEISS team didn’t just address the challenges of AI model deployment and execution — they also boosted their operational efficiency and value delivery.

As the ZEISS team embarked on this journey, they faced a clear set of challenges. Docker’s container technology offered a compelling solution to these challenges. With a clear vision and the evident benefits of Docker, ZEISS easily gained support and buy-in from key stakeholders.

They shifted to packaging AI models with their dependencies, renewing their focus on the interface design between containers and clients. This strategic decision refined the internal architecture and clarified team roles. A significant benefit of this strategy was the autonomy it provided to teams. Teams could now innovate on AI methods and integrate them into clients independently. This clear division of responsibilities streamlined workflows, reduced alignment efforts, and paved the way for greater efficiency and innovation at ZEISS.

"At ZEISS, we've always been at the forefront of technological innovation and supporting reproducible research. With Docker, we can now evolve and deploy our AI models across our platforms, ensuring consistent and reliable results every time.”

Dr. Sebastian Rhode
Software Architect – AI Solutions, Staff Expert at ZEISS Microscopy

"The challenges of deploying AI models, especially in diverse environments, are numerous. Docker's container solution has been a game-changer for us, allowing for increased modularity and efficiency when developing new products."

Dr. Sebastian Rhode
Software Architect – AI Solutions, Staff Expert at ZEISS Microscopy

"In the ever-evolving realm of AI, consistency is key. With Docker, we've achieved a level of reproducibility that ensures our AI models perform optimally, irrespective of where they are deployed."

Dr. Sebastian Rhode
Software Architect – AI Solutions, Staff Expert at ZEISS Microscopy

"Docker's container technology has provided ZEISS with the perfect toolkit to address our unique challenges in AI model deployment. It's not just a solution; it's the right solution for us."

Dr. Sebastian Rhode
Software Architect – AI Solutions, Staff Expert at ZEISS Microscopy

"What sets Docker apart is its alignment with the OCI standard and its incredible versatility. It's not just about running containers; it's about running them in a way that can be easily adapted and scaled, which is crucial for our software development approach at ZEISS."

Dr. Sebastian Rhode
Software Architect – AI Solutions, Staff Expert at ZEISS Microscopy

"The ability to encapsulate all runtime dependencies in a Docker container has been a game-changer for us. It has streamlined our deployment process, allowing us to focus more on innovation than troubleshooting compatibility issues."

Dr. Sebastian Rhode
Software Architect – AI Solutions, Staff Expert at ZEISS Microscopy

"Docker's GPU support has been invaluable for our AI models, especially when running Linux-based containers on a Windows host. These nuanced features make Docker the right fit for ZEISS's complex needs."

Dr. Sebastian Rhode
Software Architect – AI Solutions, Staff Expert at ZEISS Microscopy

"The ease of use and extensive ecosystem around Docker has accelerated our AI initiatives. It's not just a containerization tool; it's a comprehensive solution that aligns perfectly with ZEISS's goals and challenges."

Dr. Sebastian Rhode
Software Architect – AI Solutions, Staff Expert at ZEISS Microscopy

Find a subscription that’s right for you

Contact an expert today to find the perfect balance of collaboration, security, and support with a Docker subscription.