Docker announces significant upgrades to its subscription plans, delivering more value, flexibility, and tools for customers of all sizes.
Products
Docker Desktop 4.26: Rosetta, PHP Init, Builds View GA, Admin Enhancements, and Docker Desktop Image for Microsoft Dev Box
The Docker Desktop 4.26 release delivers the latest Rosetta for Docker Desktop optimizations and boosts developer productivity by resolving common issues such as Node.js freezes and PHP segmentation faults.
Building Spring Boot’s ServiceConnection for Testcontainers WireMock
Learn about the benefits of using Testcontainers and WireMock for simulating API behavior during testing.
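The post itself builds a custom Spring Boot ServiceConnection; as a rough sketch of the underlying pattern, the test below starts WireMock in a container with Testcontainers and points the application at it via @DynamicPropertySource instead. The image tag, exposed port, and the api.base-url property are illustrative assumptions, not code from the article.

```java
import org.junit.jupiter.api.Test;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.DynamicPropertyRegistry;
import org.springframework.test.context.DynamicPropertySource;
import org.testcontainers.containers.GenericContainer;
import org.testcontainers.junit.jupiter.Container;
import org.testcontainers.junit.jupiter.Testcontainers;
import org.testcontainers.utility.DockerImageName;

// Sketch: stub a remote API with WireMock running in a container during a Spring Boot test.
// The image tag, port, and "api.base-url" property are assumptions for illustration only.
@SpringBootTest
@Testcontainers
class PaymentClientTest {

    @Container
    static GenericContainer<?> wiremock =
            new GenericContainer<>(DockerImageName.parse("wiremock/wiremock:3.3.1"))
                    .withExposedPorts(8080);

    @DynamicPropertySource
    static void apiProperties(DynamicPropertyRegistry registry) {
        // Point the application at the containerized WireMock instead of the real API.
        registry.add("api.base-url",
                () -> "http://" + wiremock.getHost() + ":" + wiremock.getMappedPort(8080));
    }

    @Test
    void stubbedEndpointResponds() {
        // Stubs can be registered through WireMock's admin API or loaded from mapping files;
        // assertions against the client under test would go here.
    }
}
```

Spring Boot's @ServiceConnection support exists precisely to replace this kind of per-test property plumbing, which is what the article builds for WireMock.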
Accelerating Developer Velocity with Microsoft Dev Box and Docker Desktop
We’re pleased to announce our partnership with the Microsoft Dev Box team to streamline developer onboarding, environment setup, security, and administration with Docker Desktop.
The Livecycle Docker Extension: Instantly Share Changes and Get Feedback in Context
Livecycle’s Docker Extension makes it easy to share your work in progress and collaborate with your team. We provide step-by-step instructions for getting started with the extension.
Running Testcontainers Tests on Buildkite
This article explains how to run your Testcontainers-based tests on the Buildkite CI/CD platform using an Ubuntu VM as an agent.
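The Buildkite-specific setup (installing the agent, putting Docker on the Ubuntu VM, and defining the pipeline) is covered in the article; the hedged JUnit 5 sketch below only illustrates the kind of self-contained Testcontainers test such a pipeline would run via the project's usual build command. The Redis image is an arbitrary illustrative choice, not taken from the post.

```java
import org.junit.jupiter.api.Test;
import org.testcontainers.containers.GenericContainer;
import org.testcontainers.junit.jupiter.Container;
import org.testcontainers.junit.jupiter.Testcontainers;
import org.testcontainers.utility.DockerImageName;

import static org.junit.jupiter.api.Assertions.assertTrue;

// A minimal Testcontainers test: if it passes on the CI agent, the agent can reach
// the Docker daemon and pull and run images, which is the main prerequisite for
// Testcontainers-based suites. The redis image is just an illustrative choice.
@Testcontainers
class DockerOnAgentTest {

    @Container
    static GenericContainer<?> redis =
            new GenericContainer<>(DockerImageName.parse("redis:7-alpine"))
                    .withExposedPorts(6379);

    @Test
    void containerStartsOnCiAgent() {
        assertTrue(redis.isRunning());
        assertTrue(redis.getMappedPort(6379) > 0);
    }
}
```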
How JW Player Secured 300 Repos in an Hour with Docker Scout
For companies like JW Player, whose core business revolves around streaming, content, and infrastructure, security must be a priority without slowing down delivery or affecting operations. Learn how JW Player uses Docker to help meet these challenges, including how the team enabled more than 300 repositories for Docker Scout in just one hour.
Achieve Security and Compliance Goals with Policy Guardrails in Docker Scout
We show how Docker Scout policies enable teams to identify, prioritize, and fix their software quality issues at the point of creation.
LLM Everywhere: Docker for Local and Hugging Face Hosting
We show how to use a Hugging Face-hosted Llama AI/ML model in a Docker context, making it easier to deploy advanced language models for a variety of applications.
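The article's walkthrough covers running the model locally in a container; purely as an illustration of the hosted side, the sketch below calls Hugging Face's serverless Inference API over plain HTTP from Java. The model ID, the HF_TOKEN environment variable, and the prompt are assumptions, and gated Llama models require accepting the license on Hugging Face before they can be queried.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Sketch: query a model hosted on Hugging Face via the serverless Inference API.
// The model ID and HF_TOKEN environment variable are illustrative assumptions;
// gated models such as Llama require license acceptance on huggingface.co first.
public class HostedLlamaClient {

    public static void main(String[] args) throws Exception {
        String model = "meta-llama/Llama-2-7b-chat-hf"; // assumed model ID
        String token = System.getenv("HF_TOKEN");       // personal access token

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://api-inference.huggingface.co/models/" + model))
                .header("Authorization", "Bearer " + token)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(
                        "{\"inputs\": \"Explain Docker volumes in one sentence.\"}"))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body()); // JSON containing the generated text
    }
}
```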