The future of the data center is cloud native

Michael Ward - April 13, 2022

When we think of a data center, some of us picture a large building housing racks of servers, networking gear, and the support equipment that delivers power and cooling to all of that infrastructure. Others think of it a bit more abstractly, as a “place” where we store our data and perform compute functions on that data.

However, the concept of a data center is evolving. No longer are they bound to the confines of a physical building. Historically, we kept them on-prem. But as we move to the cloud, do we still need to own our own data centers?

The arrival of Edge computing has changed the way we think about our applications, and in turn how we think about our data centers. We need solutions that allow us to situate our data and compute where they are best located – on-prem, on the edge, or in the cloud – and this need is rapidly driving demand for hybrid and multi-cloud environments.

Why hybrid clouds matter

Why do hybrid clouds matter? In other words, what meaningful benefits and outcomes do they deliver?

  • First, hosting your data across multiple clouds provides you with leverage – empowering you with choice instead of being locked into a single cloud or hosting provider.
  • Hybrid clouds deliver the benefit of efficiency – you have the ability to place your workloads in locations with the greatest number of available resources and the best prices.
  • Hybrid clouds enable risk mitigation – by spreading your workloads across multiple providers, you lower the risk of technological, economic, or political disruptions bringing down your services.
  • More providers to choose from gives you the ability to place your workloads closer to your customers, which improves performance and can open up new markets and enable you to deliver new services.
  • Finally, with more choice you’re also better able to ensure regulatory or jurisdictional compliance for the location and control of your data and compute needs.

The future of the data center is cloud native

So how does cloud native fit into all of this? First, cloud native is built on a microservices-based application architecture – decomposing your application into smaller, independently deployable components that can then be dispatched across multiple compute locations.

This is critical, because application mobility is key to leveraging choice. Workloads should be hosted where they are needed – not just where the data centers are. Certain parts of your application may need to be located close to your users – for example, on the Edge to ensure lower latency – whereas other parts, perhaps containing sensitive data, may need to stay on-prem, where you have direct security control over the data.
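As a toy illustration of that placement logic – the types and rules below are hypothetical, not drawn from any Mirantis product – a component handling sensitive data stays on-prem, a latency-critical component goes to the edge, and everything else runs wherever capacity is cheapest or most plentiful:

```go
// placement.go – a toy sketch of the placement idea described above.
package placement

// Component describes one piece of a decomposed application and its constraints.
type Component struct {
	Name            string
	SensitiveData   bool // must stay under direct security control
	LatencyCritical bool // must run close to the user
}

// Location is a class of hosting environment a component can run in.
type Location string

const (
	OnPrem Location = "on-prem"
	Edge   Location = "edge"
	Cloud  Location = "public-cloud"
)

// Place picks a location for a component using the priorities described in
// the text: sensitive data stays on-prem, latency-critical pieces go to the
// edge, and everything else can run wherever resources are most available.
func Place(c Component) Location {
	switch {
	case c.SensitiveData:
		return OnPrem
	case c.LatencyCritical:
		return Edge
	default:
		return Cloud
	}
}
```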

Containers and Kubernetes are becoming the de facto building blocks for creating and delivering cloud native applications. Open source is accelerating the transition to an era in which deployments are not only rapid but also cheap, secure, robust, and highly performant.
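To make the “building block” idea concrete, here is a minimal, self-contained sketch – hypothetical, not taken from a Mirantis product – of the kind of small service that gets packaged into a container image and then scheduled by Kubernetes wherever it is needed:

```go
// main.go – a minimal HTTP microservice. Packaged into a container image,
// a service like this becomes the unit that Kubernetes schedules, scales,
// and moves between environments.
package main

import (
	"fmt"
	"log"
	"net/http"
	"os"
)

func main() {
	// Take the listen port from the environment so the same image can run
	// unchanged on-prem, on the edge, or in a public cloud.
	port := os.Getenv("PORT")
	if port == "" {
		port = "8080"
	}

	// A liveness endpoint that Kubernetes can probe.
	http.HandleFunc("/healthz", func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusOK)
	})

	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintln(w, "hello from a cloud native microservice")
	})

	log.Printf("listening on :%s", port)
	log.Fatal(http.ListenAndServe(":"+port, nil))
}
```

Built into an image, the same binary can be deployed unchanged to any Kubernetes cluster, which is exactly what makes containers and Kubernetes such convenient building blocks.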

Open source and cloud

Now, let’s talk more about open source and its impact on the Cloud. Open source is critically important to the success of cloud native approaches. The vast majority of businesses use open source software today – although many don’t even realize how deeply it is embedded in their business.

Open source enables speed, allowing developers and operators to deliver new applications and services faster than ever before. No longer does every company need to develop the “foundational” software upon which they build their proprietary technology. Open source allows them to focus on delivering unique value that differentiates their business from the competition.

It is true that open source development is inherently “selfish” – with contributors driving the community for solutions that help themselves – but collectively, we all benefit from this work.

Using open source does come with its own challenges, such as:

  • Choosing the right components for your specific needs from the multitude of projects out there
  • Verifying that the open source software is robust, of production quality, and will scale to your needs
  • Ensuring that what is deployed is secure and stays that way
  • Having the right skills within your company to deploy and maintain these solutions

Our vision of the future

As Mirantis looks towards the future, what do we see unfolding?

We see a world in which computing resources are no longer restricted to your private data center or to one cloud, but are located where they need to be, when they need to be. Computing power will be everywhere – in the on-prem data center, in a cloud data center, on the Edge, or perhaps even in your hand.

We see portable workloads – workloads that are truly transferable across both compute platforms and hosting environments, taking advantage of technologies that allow them to run on any available platform.

We see a world where workloads run closer to where they are needed – where the applications themselves have an inherent understanding of the user’s needs with regard to factors like security, latency, and performance.

And most importantly, we see a world built upon open ecosystems – open standards that enable developers to easily consume and deploy services across these many providers, locations, and solutions.

Architecture requirements

What is our approach to this new cloud native data center at Mirantis? To deliver the cloud native data center of the future in an open source, hybrid/multi-cloud model, we established the following key requirements:

  • Applications must be infrastructure agnostic. They must be able to run on the infrastructure of choice, and also be transferable, in real time, across infrastructures.
  • There must be a centralized resource directory to provide one source of truth on infrastructure resources available to you at a particular moment in time, as well as the ability to manage your hybrid/multi-cloud infrastructure all from a single pane of glass.
  • There must be a common API across these disparate infrastructures so that your applications can be abstracted from the unique nuances of specific providers (a minimal sketch of this idea appears after this list).
  • The platform must be built upon open source so that it can continuously evolve in this cloud native world.
  • It must be simple and intuitive to use.
  • It must allow for self-service so that it can be directly managed and controlled.
  • But it must also allow for the infrastructure to be fully operated and managed by others so that companies can focus their precious IT and development resources on building business-differentiating applications, not building and running their infrastructure.
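To illustrate the common-API requirement called out above, here is a minimal, hypothetical sketch – the names and signatures are illustrative only and do not come from any Mirantis product – of a provider-neutral interface that application tooling could target while each infrastructure supplies its own implementation:

```go
// infra.go – an illustrative sketch of the "common API" requirement.
// Nothing here names a real product API; it only shows the shape of a
// provider-agnostic abstraction.
package infra

import "context"

// WorkloadSpec describes what to run, independent of where it runs.
type WorkloadSpec struct {
	Name     string
	Image    string
	Replicas int
}

// Provider is the provider-neutral contract that each infrastructure
// (on-prem, edge, public cloud) would implement behind the scenes.
type Provider interface {
	Deploy(ctx context.Context, spec WorkloadSpec) error
	Scale(ctx context.Context, name string, replicas int) error
	Teardown(ctx context.Context, name string) error
}

// Migrate moves a workload from one provider to another; the application
// itself never has to know the specifics of either infrastructure.
func Migrate(ctx context.Context, from, to Provider, spec WorkloadSpec) error {
	if err := to.Deploy(ctx, spec); err != nil {
		return err
	}
	return from.Teardown(ctx, spec.Name)
}
```

The point of the sketch is the shape of the abstraction: once every environment satisfies the same small contract, moving a workload becomes an operation on the contract rather than on any one provider’s API.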

Mirantis Flow

Mirantis has delivered on these requirements with our Mirantis Flow offering – a Cloud Native Data Center as a Service. Mirantis Flow combines various Mirantis products, such as Mirantis Container Cloud and Mirantis Kubernetes Engine, into a robust platform that enables developers to rapidly deliver modern, cloud native applications built around containers and Kubernetes. Mirantis Flow is extremely flexible, giving developers maximum latitude in deciding how they want to combine cloud, on-prem, hosted bare metal, or Public Cloud providers to construct their platform’s infrastructure.

[Diagram of Mirantis' product suite]

To complement this, we include products such as Mirantis Secure Registry, which ensures the highest levels of security and data integrity of applications, along with other tools that enable a modern secure software supply chain. We also ensure you have the needed visibility into your infrastructure by providing tools such as Lens and StackLight.

For organizations that are not “all-in” on Cloud Native at this time, we are the only provider that offers the ability to bridge their legacy virtualization environments with their container-based infrastructure as they work to transition towards a Cloud Native world. Most importantly, we provide the ability to do all of this from a single pane of glass for deployment, monitoring, and maintenance.

We wrap all of this with Mirantis’ leading OpsCare and OpsCare Plus support and managed service offerings, giving organizations the choice of a “co-pilot” support experience or fully managed remote operations for their deployments.

The future of the data center is here today

In conclusion, cloud native is placing new and evolving requirements on how we think about, deploy, and manage our data centers. We must now think of the data center not as a physical entity but as a service that we consume. We must think of it as a broad collection of resources that delivers the right services, with the right characteristics, at the right place and the right time. And we must think about how we leverage open source technologies to make all of this possible at the speed of modern business.

With Mirantis Flow, we are delivering the Cloud Native Data Center as a Service of the future, today.

To learn more about Mirantis Flow and other Mirantis offerings that can bring your data center into the future, please visit us at https://www.mirantis.com/software/mirantis-flow/
