Why container-management platforms are on a 30%-a-year growth spurt

Source: telecomtv.com

In an attempt to overcome management challenges associated with the adoption of agile development methodologies, organizations are using container-management platforms, spurring the market to expand at a compound annual growth rate (CAGR) of 30 percent from 2018 through 2023, according to Omdia.

The adoption of cloud-native technologies using a microservices architecture is increasing the agility and flexibility of organizations by enabling the delivery of more frequent changes to meet the demands of businesses. This utilization of a microservices architecture for application development has in turn spurred interest in the use of technologies such as software containers and Kubernetes.

In a recent Omdia survey of 700 end users, 60 percent of respondents stated they had adopted agile development methodologies. However, their processes are not yet mature enough to fully exploit the capabilities and deliver the speed these technologies (i.e., containers and Kubernetes) can achieve. A total of 32 percent of organizations stated that a lack of cloud-native skills is delaying more rapid adoption of microservices-based applications. These challenges are fueling the robust growth of the container-management platform market, according to the Omdia Market Radar: Container Management Platforms 2018/19 report.

**Enter the service mesh**

As the use of containers expands, so does the complexity of managing service-to-service communications. This challenge has led to the birth of the service mesh. But what exactly is a service mesh, and how does it integrate with existing container-management platforms?

The open-source community has been working to solve the challenges of managing service-to-service communications in the container ecosystem, with the rise of Google’s Istio, HashiCorp Consul, and Linkerd, and the market expanded further in 2020. Omdia has identified seven leading service-mesh capabilities, which offer differing levels of control and require differing levels of technical knowledge to operate.

“The adoption of service-mesh technology is being inhibited by the uncoordinated manner in which the service mesh is deployed and integrated as part of any cloud-native environment,” said Roy Illsley, chief analyst, cloud and data center research practice, at Omdia. “As the maturity of the ecosystem evolves, the complexities of deployment and the process of managing and securing a service mesh must be simplified. Only then will this open the cloud-native technology to a wider audience, and therefore enable more organizations to deploy and manage microservices in production environments.”

**An important shared infrastructure**

In a microservices architecture, applications or services need to communicate with each other. This can be achieved through a central control point, but as the size and number of microservices grow, that central point becomes a performance and administrative bottleneck on traditional networking infrastructure. A service mesh is a new and separate infrastructure layer deployed alongside each application, and because it has visibility of all service-to-service communications, it is the ideal layer at which to optimize the environment.
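
To make the idea of a separate infrastructure layer concrete, here is a minimal, purely illustrative sketch in Go of a sidecar-style data-plane proxy: it sits next to an application, forwards every request to it, and observes each call. The port numbers are assumptions chosen for illustration; in a real mesh such as Istio or Linkerd this proxy is injected automatically alongside each workload rather than written by hand.

```go
// A minimal, illustrative sidecar-style proxy: it sits in front of a local
// service, forwards traffic to it, and observes every request. Real mesh
// proxies add routing, retries, mTLS and telemetry at this same point.
package main

import (
	"log"
	"net/http"
	"net/http/httputil"
	"net/url"
	"time"
)

func main() {
	// The application the proxy fronts; the address is an assumption.
	app, err := url.Parse("http://127.0.0.1:8080")
	if err != nil {
		log.Fatal(err)
	}
	proxy := httputil.NewSingleHostReverseProxy(app)

	handler := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		start := time.Now()
		proxy.ServeHTTP(w, r) // forward to the local application
		// Because all traffic passes through the proxy, the mesh gains
		// visibility of every service-to-service call.
		log.Printf("method=%s path=%s duration=%s", r.Method, r.URL.Path, time.Since(start))
	})

	// The proxy listens on its own port; peers address the proxy, not the app.
	log.Fatal(http.ListenAndServe(":15001", handler))
}
```

Because every request flows through such a proxy, a mesh control plane can apply routing, security and telemetry policies without changing the application itself.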

The basic concept of microservices is that each application, or service, relies on other services to deliver an outcome. For example, if users request train tickets online, they need to know the times of the trains, the cost of the tickets and the routes the trains take. The user-request service on the web page therefore needs to communicate with the route database, which in turn needs to communicate with the schedule and then the pricing database (with any special offers also being interrogated). Finally, the result must populate the user’s online order cart. To make the website more user friendly, the service may also want to make suggestions based on the user’s history. All of these are separate microservices that must locate each other and deliver the required function based on the context of the request.
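
A rough sketch of what that fan-out looks like in code is shown below (Go, illustrative only; the routes, schedule and pricing service names and URLs are hypothetical, not taken from the Omdia report). Each user request to the ticket service triggers several service-to-service calls, which is exactly the traffic a service mesh has to help locate, route and secure.

```go
// Illustrative only: a ticket-search service that fans out to other
// microservices. The service names and URLs (routes, schedule, pricing)
// are hypothetical; in Kubernetes they would normally be resolved by
// cluster DNS or another form of service discovery.
package main

import (
	"fmt"
	"io"
	"log"
	"net/http"
	"time"
)

var client = &http.Client{Timeout: 2 * time.Second}

// callService fetches one dependency and returns its body (errors simplified).
func callService(url string) string {
	resp, err := client.Get(url)
	if err != nil {
		return "unavailable"
	}
	defer resp.Body.Close()
	body, _ := io.ReadAll(resp.Body)
	return string(body)
}

// searchTickets handles one user request by calling several microservices.
func searchTickets(w http.ResponseWriter, r *http.Request) {
	q := "?from=" + r.URL.Query().Get("from") + "&to=" + r.URL.Query().Get("to")

	routes := callService("http://routes/search" + q) // which routes exist
	times := callService("http://schedule/times" + q) // when trains run
	price := callService("http://pricing/quote" + q)  // what tickets cost

	fmt.Fprintf(w, "routes=%s times=%s price=%s\n", routes, times, price)
}

func main() {
	http.HandleFunc("/tickets", searchTickets)
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```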

The performance and scalability challenge with this networked approach, where each service performs a specific function, is dealing with request overload. Each service might need to request data from several other services, and if one service, such as the schedule service above, is central to all requests, then managing the traffic flows becomes an issue. The service mesh is designed to route and optimize traffic in these scenarios with minimal overhead.
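
To show what the mesh takes off the application’s hands, the sketch below hand-rolls that traffic management in Go: it round-robins calls across replicas of a hypothetical schedule service and retries failures. The replica addresses, timeout and retry count are assumptions; a service-mesh proxy applies equivalent load-balancing and retry policy transparently, driven by configuration rather than application code.

```go
// Hand-rolled client-side load balancing and retries - the kind of traffic
// management a service-mesh proxy applies transparently. Replica addresses
// are hypothetical.
package main

import (
	"fmt"
	"net/http"
	"sync/atomic"
	"time"
)

var replicas = []string{ // hypothetical replicas of the schedule service
	"http://schedule-0:8080",
	"http://schedule-1:8080",
	"http://schedule-2:8080",
}

var next uint64 // round-robin counter

var client = &http.Client{Timeout: 500 * time.Millisecond}

// getWithRetry spreads calls across replicas and retries failed attempts.
func getWithRetry(path string, attempts int) (*http.Response, error) {
	var lastErr error
	for i := 0; i < attempts; i++ {
		// Pick the next replica in round-robin order.
		replica := replicas[atomic.AddUint64(&next, 1)%uint64(len(replicas))]
		resp, err := client.Get(replica + path)
		if err == nil && resp.StatusCode < 500 {
			return resp, nil // success
		}
		if err == nil {
			resp.Body.Close()
			err = fmt.Errorf("replica %s returned %d", replica, resp.StatusCode)
		}
		lastErr = err
	}
	return nil, fmt.Errorf("all %d attempts failed: %w", attempts, lastErr)
}

func main() {
	resp, err := getWithRetry("/times?route=42", 3)
	if err != nil {
		fmt.Println("error:", err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("status:", resp.Status)
}
```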

**What to look for in a service mesh**

Service meshes are designed to solve the challenges of managing a large number of intra-service communications in a microservices architecture. While there are a number of different solutions available, Omdia considers that the core capabilities/architectural best practices of a service mesh, as shown in the chart, must include the following (one of these capabilities is illustrated with a short sketch after the list):

- Control plane and data plane separation
- Dynamic routing
- Service discovery
- Load balancing
- Observability and traceability
- Security
- Interoperability
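
As one concrete illustration of these capabilities, observability and traceability typically rest on propagating a request or trace identifier on every hop so a call can be followed across services. The sketch below (Go, illustrative; the X-Request-ID header name and downstream URL are assumptions) shows what an application has to do by hand when no mesh proxy injects and forwards such headers for it.

```go
// Illustrative traceability without a mesh: the application must generate a
// request ID itself and forward it on every outbound call so the call chain
// can be correlated across services. A service mesh does this in its proxies.
// The header name and downstream URL are assumptions.
package main

import (
	"crypto/rand"
	"encoding/hex"
	"log"
	"net/http"
)

const requestIDHeader = "X-Request-ID"

// requestID reuses an upstream ID if present, otherwise creates a new one.
func requestID(r *http.Request) string {
	if id := r.Header.Get(requestIDHeader); id != "" {
		return id
	}
	buf := make([]byte, 8)
	rand.Read(buf)
	return hex.EncodeToString(buf)
}

func handler(w http.ResponseWriter, r *http.Request) {
	id := requestID(r)
	log.Printf("request_id=%s path=%s", id, r.URL.Path)

	// Forward the same ID to a downstream service so its logs join ours.
	out, _ := http.NewRequest(http.MethodGet, "http://pricing/quote", nil)
	out.Header.Set(requestIDHeader, id)
	if resp, err := http.DefaultClient.Do(out); err == nil {
		resp.Body.Close()
	}

	w.Header().Set(requestIDHeader, id)
	w.Write([]byte("ok\n"))
}

func main() {
	http.HandleFunc("/", handler)
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```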
