Kubernetes – from internal project to world leader
Kubernetes’s flexibility and ability to evolve means it will continue to be at the heart of cutting-edge technologies like 5G and IoT, writes Carmine Rimi, Canonical
Kubernetes, the container orchestration system for automating application deployment, recently celebrated its fifth birthday. Starting life as an internal project at Google, it has grown into one of the largest open source projects in the world.
This dominance has been supported by the platform’s inherent benefits. After all, Kubernetes can scale to the requirements of virtually any business. Its ability to deploy and manage containerised applications has opened the opportunity for exciting innovations in AI, IoT, cloud and edge computing. With a huge open source development community behind it, Kubernetes is now central to the development of cutting-edge technologies across the board.
Community is king
One of the main reasons Kubernetes has become so popular is its ability to drive faster, more reliable innovation. Because the platform is open source and highly flexible, developers can pick up the latest technologies as they emerge and accelerate their adoption within the business.
Behind Kubernetes is a huge, knowledgeable and creative development community at the vanguard of innovation – developers can benefit from freely exchanging ideas and concepts with this group. Through dialogue with the brightest and the best, ecosystem developers have resolved issues and ushered in the entry of exciting new technologies like AI, machine learning and robotics.
Innovation isn’t the only beneficiary either; security is too. The open source community behind Kubernetes has a strong and passionate commitment to security. Indeed, the Security Audit Working Group was established specifically to support a long-term commitment to a secure and robust Kubernetes ecosystem. Developers engaging with their peers in the Kubernetes community to discuss security issues and compare best practice can only benefit the security of IT projects.
Beyond IT security, software built collaboratively by a community that works holistically to advance the whole landscape is hugely powerful. This is where the next disruptive solutions will originate.
Perfect partners – Kubernetes and AI
AI has the potential to decrease operating costs, improve business decision making and enhance the day-to-day lives of employees via workload automation.
It is one of the most promising technologies on the face of the planet, with the number of AI start-ups increasing 14-fold since 2000, according to Stanford University.
Yet building AI applications is challenging. They are difficult to design and write and comprise countless types of data. Multiple steps are required at each stage to start constructing even the most basic AI application, each requiring different skills. Even when AI applications are developed, a new challenge arises in how to manage and seamlessly switch them between platforms.
Containerised applications have become the leading choice to solve these challenges. Containers provide a compact and portable environment for workloads like AI to run in. They are also straightforward to scale and move through a range of environments – from development, through test and into production. This process means that large-scale applications can be broken down into targeted and easy-to-maintain microservices.
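As a sketch of what that packaging looks like, the hypothetical Dockerfile below bundles a simple Python inference script into a single portable image (the file names `serve.py` and `requirements.txt` are illustrative assumptions, not taken from any real project):

```dockerfile
# Hypothetical example: package a Python inference service into an image.
FROM python:3.11-slim
WORKDIR /app

# Install pinned dependencies so the container behaves identically everywhere.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the (assumed) application code into the image.
COPY serve.py .

# The same image now runs unchanged in development, test and production.
CMD ["python", "serve.py"]
```

Once built, the identical image can be deployed to a laptop, a test cluster or production, which is precisely the portability the container model provides.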
Kubernetes has proven adept at orchestrating containers for AI development. For example, Kubeflow, an ML stack built for Kubernetes, launched in 2017 specifically to reduce the challenges of building production-ready AI systems, such as manual coding, combining components from different vendors, and hand-rolled integration. Kubeflow also makes it easier to move ML models from development into production without any major re-architecture.
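As a minimal sketch of how such training workloads are declared, the manifest below uses Kubeflow's TFJob custom resource to describe a distributed TensorFlow training job (the job name, image and worker count are illustrative assumptions):

```yaml
# Hypothetical TFJob manifest for Kubeflow's training operator.
apiVersion: kubeflow.org/v1
kind: TFJob
metadata:
  name: mnist-training            # illustrative job name
spec:
  tfReplicaSpecs:
    Worker:
      replicas: 2                 # two worker pods share the training load
      restartPolicy: OnFailure    # restart failed workers automatically
      template:
        spec:
          containers:
          - name: tensorflow      # TFJob expects this container name
            image: example.com/mnist-train:latest  # hypothetical image
```

Submitted with `kubectl apply -f tfjob.yaml`, the operator then creates and supervises the worker pods, so the data scientist describes *what* to train rather than *how* to wire up the cluster.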
Generally, Kubernetes can scale AI algorithms for optimum effectiveness. The platform is designed to scale in line with growth, even with deep learning algorithms and data sets, which require huge amounts of compute power. It can also deploy AI-enabled workloads over multiple commodity servers, across the software pipeline, while abstracting away the management overhead.
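A hedged sketch of this scaling pattern using standard Kubernetes resources: a Deployment spreads an inference service over the available nodes, and a HorizontalPodAutoscaler adds replicas as load grows (all names, images and thresholds below are illustrative, not from a real deployment):

```yaml
# Hypothetical Deployment plus autoscaler for an AI inference service.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: model-server              # illustrative service name
spec:
  replicas: 3                     # baseline capacity on commodity servers
  selector:
    matchLabels:
      app: model-server
  template:
    metadata:
      labels:
        app: model-server
    spec:
      containers:
      - name: server
        image: example.com/model-server:latest  # hypothetical image
        resources:
          requests:
            cpu: "500m"           # required for CPU-based autoscaling
---
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: model-server
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: model-server
  minReplicas: 3
  maxReplicas: 20                 # scale out as inference demand grows
  metrics:
  - type: Resource
    resource:
      name: cpu
      target:
        type: Utilization
        averageUtilization: 80    # add pods when average CPU exceeds 80%
```

The scheduler places the replicas across nodes and the autoscaler adjusts their number, which is the management overhead Kubernetes abstracts away.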
Once the models are trained, serving them in various deployment scenarios, from edge compute to central data centres, is straightforward. It is this flexibility, alongside repeatability and fault tolerance for complex workloads, that makes Kubernetes the de facto standard for managing containerised AI applications.
Containers and technological evolution
Containers are going to be at the very heart of the development of cutting-edge technologies like 5G and IoT networks, as they deliver faster deployment and the ability to scale to meet demand. Moreover, because a container packages a piece of software with everything it needs to run, that software can run flexibly in different operating environments, which allows for portability.
For 5G networks, containers must mature as they have done in the enterprise space. This is especially true of orchestration. As the leading platform for this task, Kubernetes must evolve to support the complexity and large numbers of containers 5G needs. However, the launch of tools such as Kubeflow is a testament to the fact that the Kubernetes ecosystem is evolving.
This ability to evolve and remain flexible, among other benefits, is what has led to Kubernetes’ popularity. Having recently celebrated its fifth birthday, its future looks bright. With its superior ability to manage containerised apps, and a knowledgeable and passionate open source development community behind it, Kubernetes is poised to be the solution of choice for the development of future technologies.