While developers drool over and start playing around with Docker, an open-source technology for packaging code into convenient containers that can move from one server environment to another, many big companies have largely been left out of the conversation. Plenty of IT shops run the Windows operating system on their own servers, which has ruled out Docker in those environments, except when developers take to external public clouds.
Today Microsoft declared that it's seen the light. It's now working with Docker to make the company's container technology available in data centers in the next version of Windows Server. That means developers can build new applications inside Docker containers for development, testing, and production if they so choose and then, without changing their code, move the containerized applications from one cloud to another.
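That portability boils down to a simple build artifact. As a rough sketch (the base image, file names, and port here are hypothetical, not from any Microsoft or Docker announcement), a Dockerfile describes the application and its dependencies once, and the resulting image runs unchanged on any host with a Docker engine:

```dockerfile
# Hypothetical Dockerfile for a small Node.js web app.
# Everything the app needs is baked into one image.
FROM node:slim              # base image (illustrative choice)
WORKDIR /app
COPY package.json .
RUN npm install             # dependencies installed at build time, not deploy time
COPY . .
EXPOSE 3000                 # port the app listens on
CMD ["node", "server.js"]   # same entry point in dev, test, and production
```

An image built from this file (for example with `docker build -t myapp .`) can then be started with `docker run myapp` on a developer laptop, an on-premises server, or a public cloud, which is the "without changing their code" promise in practice.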
“There’s a huge community of Windows developers out there that we want to help have the enabling technologies to write the next generation of applications, and do it wherever and however they want,” Mike Schutz, general manager of product marketing in Microsoft’s server and tools division, told VentureBeat in an interview.
It’s a reasonable step to take for Microsoft, which surely wants to keep companies happily paying to run Windows, rather than switching over to open-source Linux alternatives. It’s one thing for Linux-focused Red Hat to mash Docker container capability into its operating system for servers. But it’s something else, something significant and meaningful, when a non-Linux operating system provider like Microsoft works to support a project that has its roots in low-level Linux components.
For years, Microsoft shunned open-source tools in favor of pushing proprietary technology. But Microsoft is a changing company: witness its support for the Open Compute Project, Hadoop, and even Docker integration in its Azure public cloud. So the move to bring Docker containers to Windows Server isn't earth-shattering.
What might have been more surprising is VMware's decision to work with Docker and make it "enterprise ready." VMware, by the way, came out today with new benchmarks showing Docker's performance in various configurations. Clearly the company doesn't want to be left out of the Docker conversation.
And Microsoft doesn't either. It wants to support the development of cloud-based applications that get divvied up into simple components. Schutz described that as the low-hanging fruit, because there's much more. There are all the long-established, complicated applications that have worked just fine in virtual machines (to which Docker containers are an alternative) and will likely continue to do so.
“Over time, I think [there’s a] different use case for what enterprise applications make their way into containerized applications,” Schutz said. “I think we’re really anxious to see how that works. We’ll be working really closely with our customers on that.”
But perhaps the biggest winner here is Docker itself. As Docker containers continue to take hold, companies could start to manage their applications at the container level, with the underlying infrastructure increasingly abstracted away.
“You’re no longer going to need kind of separate operational tools to manage, deploy, monitor, and scale,” Scott Johnston, Docker’s senior vice president of product, told VentureBeat in an interview. “All those operational functions can now be unified across the enterprise.”
Meanwhile, on the development side, things can become more streamlined as well.
“Honestly,” Johnston said, “the focus is on the application, and we’re agnostic to the underlying infrastructure where Dockerized applications can land and run.”