Mirantis adds support for GCP and Ceph Storage to container platform


At the KubeCon + CloudNativeCon North America conference, Mirantis today announced it has added Google Cloud Platform (GCP) support to the managed instances of Kubernetes clusters it provides. The company also now enables stateful applications to be deployed on the open-source Ceph object-based storage software.

Shaun O’Meara, Technical Field Manager at Mirantis, explains that the company includes an abstraction layer that allows file-based applications to be deployed on top of Ceph running in the Mirantis Container Cloud, which further streamlines IT operations by unifying storage management.

There has been a long-standing debate about the merits of deploying stateful applications on Kubernetes clusters. This debate seems to be coming to an end as more and more of these apps are deployed. In theory, IT teams can develop stateless applications that store data on external storage. However, many organizations have adopted hyperconverged infrastructure to reduce the total cost of IT by unifying compute and storage management. By adding Ceph support, Mirantis makes it possible to add stateful applications to a managed Kubernetes service that the company touts as “ZeroOps” in the sense that many routine management tasks have been automated.
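To illustrate what deploying a stateful workload on Kubernetes looks like in practice, the sketch below shows a PersistentVolumeClaim bound to a Ceph-backed StorageClass, with a Pod mounting it. This is a generic Kubernetes example under stated assumptions, not Mirantis's specific configuration; the StorageClass name `ceph-rbd` and the mount path are hypothetical.

```yaml
# Hypothetical PersistentVolumeClaim requesting storage from a
# Ceph-backed StorageClass; the class name "ceph-rbd" is an assumption.
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: app-data
spec:
  accessModes:
    - ReadWriteOnce
  resources:
    requests:
      storage: 10Gi
  storageClassName: ceph-rbd
---
# A Pod mounting that claim, so the application's state
# persists across container restarts and rescheduling.
apiVersion: v1
kind: Pod
metadata:
  name: stateful-app
spec:
  containers:
    - name: app
      image: nginx:stable
      volumeMounts:
        - name: data
          mountPath: /var/lib/app
  volumes:
    - name: data
      persistentVolumeClaim:
        claimName: app-data
```

Because the StorageClass abstracts the Ceph details away, the application manifest stays the same whether the cluster runs in a public cloud, on premises, or at the edge.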

The Mirantis Kubernetes Engine (MKE) that is at the heart of Mirantis Container Cloud can already be deployed in the cloud, on premises, or at the edge. GCP support provides another cloud option alongside Amazon Web Services (AWS) and Microsoft Azure.

In addition to these platforms, Mirantis offers a suite of ZeroOps services to manage everything from Kubernetes and OpenStack to continuous integration/delivery (CI/CD) platforms and web application rapid deployment tools. These services are enabled by the open-source Lagoon platform, which analyzes how an application is built using containers, whether on a local desktop or in a Git repository. Lagoon, which Mirantis gained through its acquisition of amazee.io earlier this year, automatically invokes the application programming interfaces (APIs) required to deploy an application to a Kubernetes cluster.

It’s not clear how widely IT organizations will adopt ZeroOps to build and deploy applications, but additional layers of abstraction in the form of frameworks do simplify building and deploying applications on Kubernetes clusters. Automation, of course, has always been at the heart of any DevOps initiative. The question organizations currently face is the extent to which they want to manage DevOps platforms themselves rather than rely on the expertise of an external service provider.

Whatever the approach, one thing is certain: managing Kubernetes environments is becoming increasingly automated through a set of DevOps best practices. As this trend continues, the platform itself will become more accessible to a wider range of organizations. IT teams should therefore expect the number of cloud-native applications deployed on Kubernetes clusters to grow substantially over the coming year.

