Case Study

How Ingka Group uses MLOps with Docker and Kubernetes for AI/ML Deployment Efficiency

Industry: Home Furnishings and Retail
Company size: 231,000
Location: Global, with more than 482 stores and planning studios across 31 countries. Headquarters in Leiden, Netherlands

Highlights

  • MLOps team scales AI/ML efficiency with Docker and Kubernetes.
  • A unified platform makes collaboration among Ingka Group’s AI/ML teams more consistent and efficient.
  • Noticeable advancements in development speed, scalability, and security.

Introduction

The Ingka Group, operating IKEA retail stores, is known for its commitment to sustainability, digital innovation, and community engagement in the home furnishing sector. Because the Ingka Group serves billions of customers across 31 countries, operational efficiency and technological advancement are paramount to its mission.

Beyond the storefront, the company’s engineering teams develop and maintain the technology supporting the entire business. The designated MLOps team builds and manages Ingka Group’s internal development ecosystem, ensuring teams can deploy and maintain AI/ML (Artificial Intelligence/Machine Learning) systems in production reliably and efficiently.

Ingka Group has embarked on an ambitious journey to harness the potential of AI/ML to enhance operational efficiency and customer experience, with efforts ranging from demand forecasting to lifelike design apps for consumers. The company’s proactive stance on ethical AI usage and its commitment to responsible innovation underscore its dedication to advancing retail with integrity and foresight.

Tasked with making AI/ML development easier, Ingka Group’s MLOps team of 14 has a structured yet flexible approach, providing both automation and personalized support. Yazin Faizan Shaik Mohammed, Tech Lead for the MLOps Platform, explains the team’s service model for internal developers, saying, “You concentrate on building your model. We will make sure that it is up and running all the time.”

Challenges

Ingka Group’s AI/ML journey came with its own set of challenges in the MLOps landscape. Seeing the potential of AI/ML, the MLOps team began shaping its strategy. “Everything started with identifying the current challenges and focusing on solving them,” Soufiane Benzaoui, Consultant Machine Learning Engineer, explains, underlining the initial steps toward streamlining AI/ML integration. In a sector where trends shift swiftly, any delay from idea to deployment means potential market advantages could slip away. Making daily development faster is likely to have multiplying effects across the organization.

Balancing modularity and integration

The team discovered different practices and tools across various groups. Benzaoui says that complex environments need “components working solely but also connecting well with each other.” Without modularity, quality suffers due to interference during model training, evaluation, and deployment. At the same time, components must integrate and communicate to form a cohesive ML pipeline.

Securing data in global AI operations

Ingka Group’s global presence and the vast amount of customer data it processes necessitate a high standard of security. Models must be trained without compromising data privacy, with particular data access, processing, and storage requirements. Deployed models must be secure from external tampering. Dependencies must be free from serious vulnerabilities.

Sharing and reproducing work

Different teams used different tools and practices, so developers faced inconsistent development environments. Sharing work and replicating results was difficult, with developers spending excessive time troubleshooting. The company recognized that these challenges risked slowing down AI/ML development and began looking for solutions.

Solution

Ingka Group created a specialized MLOps team to tackle these challenges, enabling teams with dependable tools, resources, and mentorship. The MLOps team recognized the importance of a unified platform for managing large-scale AI/ML projects smoothly, and chose Docker to package software in containers and Kubernetes to orchestrate those containers. The team incorporates essential tools such as MLflow for experiment tracking and Seldon Core for serving models, along with customized tools that fit Ingka Group’s unique needs.
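
As an illustration of the experiment-tracking piece of that stack, the sketch below logs a single training run to MLflow. It is a minimal, hypothetical example: the tracking URI, experiment name, and model are placeholders, not Ingka Group’s actual configuration.

    import mlflow
    import mlflow.sklearn
    from sklearn.datasets import make_regression
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.metrics import mean_absolute_error
    from sklearn.model_selection import train_test_split

    # Point at a tracking server (placeholder URL) and pick an experiment.
    mlflow.set_tracking_uri("http://mlflow.internal.example:5000")
    mlflow.set_experiment("demand-forecasting-demo")

    # Toy data standing in for a real demand-forecasting dataset.
    X, y = make_regression(n_samples=500, n_features=8, noise=0.1, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    with mlflow.start_run():
        model = RandomForestRegressor(n_estimators=200, random_state=0)
        model.fit(X_train, y_train)
        mae = mean_absolute_error(y_test, model.predict(X_test))

        # Parameters, metrics, and the model artifact are all recorded with the run.
        mlflow.log_param("n_estimators", 200)
        mlflow.log_metric("mae", mae)
        mlflow.sklearn.log_model(model, "model")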

Benzaoui emphasizes the importance of MLOps. “The struggle now with data science or machine learning is acknowledging the necessity of embracing MLOps, along with all the required skills and tools and cultural adjustments,” Benzaoui says. The rollout of Ingka Group’s MLOps platform was planned to boost its AI/ML capabilities through both tools and processes.

Assessment and planning

The initial phase involved a thorough assessment of existing AI/ML workflows, tools, and infrastructure. Ingka Group identified key areas needing improvement, such as development cycles, security, and standardization across teams. Strategic planning sessions were held to outline the goals and scope of the MLOps platform, ensuring alignment with Ingka Group’s broader business objectives and technological landscape.

A platform that scales

Choosing the right tech was key for the platform. Ingka Group chose Docker and Kubernetes because they’re scalable, flexible, and well-regarded. Docker made sure AI/ML work was consistent across all stages from development to production, and Kubernetes enabled easy management and deployment of these Docker containers.
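
To make that “build once, run anywhere” consistency concrete, here is a minimal sketch using Docker’s Python SDK (docker-py) to build a model-serving image and run it locally in the same form it would take in the cluster. The image name, tag, and port are hypothetical, not taken from Ingka Group’s platform.

    import docker

    # Connect to the local Docker daemon.
    client = docker.from_env()

    # Build the serving image from the Dockerfile in the current directory.
    # The same image, unchanged, moves through development, testing, and production.
    image, build_logs = client.images.build(path=".", tag="demand-forecast:1.0.0")

    # Run the image locally, just as the orchestrator will run it later.
    container = client.containers.run(
        "demand-forecast:1.0.0",
        detach=True,
        ports={"8080/tcp": 8080},
    )
    print(container.id)

Because the artifact that gets tested is the exact image that ships, differences between development and production environments are largely eliminated.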

After choosing these technologies, the MLOps team started on the platform design. They aimed for a system that was flexible, secure, and able to handle the full range of required AI/ML tasks. The design included important features: environments for development and experimentation, pipelines for rolling out models and scheduling workloads, and tools for tracking performance and managing deployments.
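
On the rollout side, the following sketch shows the kind of Kubernetes Deployment such a pipeline might create for a containerized model server, using the official Kubernetes Python client. The namespace, names, and image are hypothetical stand-ins rather than Ingka Group’s actual resources.

    from kubernetes import client, config

    # Local kubeconfig for illustration; a pipeline running in-cluster would call
    # config.load_incluster_config() instead.
    config.load_kube_config()
    apps = client.AppsV1Api()

    # Describe a two-replica Deployment for the containerized model server.
    container = client.V1Container(
        name="model-server",
        image="registry.example.com/demand-forecast:1.0.0",  # placeholder image
        ports=[client.V1ContainerPort(container_port=8080)],
    )
    template = client.V1PodTemplateSpec(
        metadata=client.V1ObjectMeta(labels={"app": "demand-forecast"}),
        spec=client.V1PodSpec(containers=[container]),
    )
    spec = client.V1DeploymentSpec(
        replicas=2,
        selector=client.V1LabelSelector(match_labels={"app": "demand-forecast"}),
        template=template,
    )
    deployment = client.V1Deployment(
        metadata=client.V1ObjectMeta(name="demand-forecast"),
        spec=spec,
    )

    # Kubernetes keeps the desired number of replicas of the model server running.
    apps.create_namespaced_deployment(namespace="ml-serving", body=deployment)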

Ingka Group wanted to make team environments more consistent to boost efficiency and teamwork. The MLOps platform provided a common set of tools and methods, ironing out differences in how teams worked, and leading to smoother AI/ML project workflows.

Security, legal, and ethical standards

Ingka Group took data privacy and security seriously in designing their platform. They ensured data was encrypted when being sent and stored, set up access controls based on user roles, and followed global data protection laws. Regular checks and assessments were also set up to keep the platform safe and secure. With so much customer data to handle, Ingka Group made these strict security steps a top priority. This ensured that all AI/ML activities met both legal and ethical standards.

Onboarding teams

Ingka Group also ensured its teams could fully use the platform, rolling out extensive training for developers and data scientists. This approach helped the teams understand the platform’s features and the best practices in AI/ML development. They planned a gradual migration of existing projects to the new platform, offering plenty of support along the way. This training and support aimed to make sure the teams were confident and skilled, ready to get the most out of the platform.

Iterative enhancement

Recognizing that the MLOps platform would evolve, Ingka Group adopted an iterative approach to its development and enhancement. Feedback mechanisms were established to gather insights from users, allowing continuous improvement of the platform’s features and usability. This approach ensures the platform remains responsive and adaptable to the needs of Ingka Group’s AI/ML projects and the evolving technological landscape.

Throughout the implementation, Ingka Group maintained a focus on collaboration, efficiency, and innovation. By methodically addressing each aspect of the MLOps platform’s development and deployment, Ingka Group positioned itself to leverage AI/ML technologies more effectively, driving forward its goals of enhancing customer experience and operational excellence.

Key benefits

Ingka Group’s adoption of Docker and its development of an MLOps platform have led to significant improvements across several key areas:

Streamlined development

With MLOps development based on Docker, sharing complex resources is easy, shortening the time from concept to deployment compared with less portable technologies.

Easier collaboration

With uniform development environments across teams, developers can share models more consistently at every stage of the development process.

Improved security

Docker's containerization offers robust isolation and security features, protecting Ingka Group's data and AI/ML models throughout the development and deployment processes, essential for maintaining customer trust and compliance.

Scalability and flexibility

With Docker, Ingka Group can dynamically scale its AI/ML applications to meet fluctuating demands, ensuring resource efficiency and optimal performance without significant overhead costs.

Continuous improvement

Docker supports Ingka Group's iterative approach to AI/ML development, enabling quick updates, easy incorporation of feedback, and continuous refinement of models and applications to meet evolving customer needs and market trends.

Results

The strategic implementation of Docker and Kubernetes within the MLOps platform established a new benchmark for AI/ML deployments at Ingka Group. “We wanted a platform that provides security, observability, scalability, and ease of use. With a more traditional approach, that would be extra-challenging, but Kubernetes and Docker provided the tooling to make all of those pillars easier to have,” Benzaoui says. Although iterative improvement is ongoing, the MLOps platform has already proven valuable for Ingka Group’s AI/ML development.

Flexible, fast, and collaborative cycles

Ingka Group chose technology stacks based on containerized development to get the best possible speed in its AI/ML development cycles, from initial prototyping to deployment. Containerized tooling is optimized for speed and makes resources easier to share. “Docker fosters collaboration both within and across teams,” Benzaoui says. By containerizing its AI/ML applications, Ingka Group could quickly prototype, test, and deploy new models. Team members could more easily share updated or more performant resources.

The user-friendly nature of Docker also encouraged widespread adoption across departments, making AI/ML more approachable throughout the organization.

Benzaoui notes that although exact numbers are hard to pinpoint, there’s a noticeable reduction in the time to market for applications developed on the platform. “With the current users, we are already seeing a better approach and also reduced time to market for their applications,” Benzaoui says. Performance improvements also give teams room to scale model complexity. “It affects the development cycle from end to end; the models are now scalable, secure, and observable.”

Uniform and secure development environments

Ingka Group’s MLOps platform serves as a unifying framework, providing a consistent set of tools and practices for all developers. The platform has standardized the development environment across different projects and teams, ensuring that components can work independently yet seamlessly integrate when necessary. Consistency across development, testing, and production improved both the quality of model training and the efficiency of the deployment process.

Ingka Group’s focus on security, bolstered by Docker and Kubernetes, has strengthened its overall security posture. “Kubernetes on Docker provides isolation, which is very important,” Benzaoui says. This isolation minimizes the risk of cross-application interference and potential breaches.

Secure development environments are further supported by Docker Trusted Content and Docker Scout, which improve the overall security posture of Ingka Group’s AI/ML development process. Docker Scout analyzes image layers and provides remediation suggestions, often recommending a more secure or up-to-date base image from Docker Trusted Content.

Benzaoui emphasizes the importance of investing in the right foundation, saying, “You may take more time at the beginning, but you’re going to see the gain in the long term. Long term you see the gain where your model will be more reliable, more observable, and if there’s an issue, you can fix it quickly.”

“You concentrate on your model... we will make sure that your model is up and running all the time.”

Yazin Faizan Shaik Mohammed
Tech Lead MLOps Platform, Ingka Group

“Technologies are there to bring solutions that make life of other people easier.”

Yazin Faizan Shaik Mohammed
Tech Lead MLOps Platform, Ingka Group

“Docker and Kubernetes play a major role when it comes to Ingka Group and developing a platform.”

Yazin Faizan Shaik Mohammed
Tech Lead MLOps Platform, Ingka Group

“Kubernetes on Docker provides isolation, which is very important.”

Soufiane Benzaoui
Consultant Machine Learning Engineer, Ingka Group

“The ability to move one workload from one place to another; you want to move it to one Kubernetes cluster or another cloud provider, that’s no problem.”

Soufiane Benzaoui
Consultant Machine Learning Engineer, Ingka Group

“We wanted a platform that provides security, observability, scalability, and all of that. You can obviously try to do all of that in the ‘old fashioned way’, but Kubernetes and Docker provided the tooling to make all of those pillars easier to have.”

Soufiane Benzaoui
Consultant Machine Learning Engineer, Ingka Group

“You may take more time at the beginning, but you’re going to see the gain in the long term. Long term you see the gain where your model will be more reliable, more observable, and if there’s an issue, you can fix it quickly.”

Soufiane Benzaoui
Consultant Machine Learning Engineer, Ingka Group

“That involves the whole process, from best practices to technical tools like Docker.”

Soufiane Benzaoui
Consultant Machine Learning Engineer, Ingka Group

“The struggle now with data science or machine learning is acknowledging the necessity of embracing MLOps. And that comes with all the required skills and tools and culture change as well.”

Soufiane Benzaoui
Consultant Machine Learning Engineer, Ingka Group
