How to manage distributed apps built on microservices

Distributed applications can clearly benefit from a microservices architecture, but those benefits have traditionally come with certain drawbacks. Discover how advancements in container technology have made it easier to manage distributed applications built on microservices.

Distributed applications built on microservices offer a great way to run cloud-native workloads. This type of architecture provides real benefits -- improved control over performance, high availability, better disaster recovery and deeper visibility -- but those benefits still come at a cost. A distributed app built on microservices is far more complex than a traditional application, so building one that runs successfully is both an art and a science.

Containers enable geographical distribution

Traditional applications are closely tied to their underlying infrastructures. Modern container technology changes that relationship because it lets the same application run on any kind of infrastructure by abstracting away the infrastructure layer from the application code. This transformation has ushered in the era of the multi-cloud or hybrid cloud model. Application portability was the first promise that drew developers to containers, and it's also the distinguishing feature of container technology that enables distributed microservices.

Geographically distributed applications come in every shape and size. You can centralize some aspects of the application and distribute others. With geographical distribution, you must consider a cloud vendor's availability zones. Beyond the cloud vendor's regions, you may also house data in your own data centers, managed either locally or remotely. Apart from the physical location of hardware, the locations of the development team and of the end users also influence how the application performs. Therefore, it's important to choose physical locations that are as close as possible to your app's users.

A container orchestration tool, such as Kubernetes, brings together all the components required to run applications in distributed environments. Kubernetes manages cloud instances on any cloud provider, and it groups containers into pods, an abstraction that makes the infrastructure easier to manage and better suited to distributed apps.

In Kubernetes, each workload should run more than one pod replica so that a healthy replica can take over if a pod fails. The number of replicas is maintained automatically for every Kubernetes Deployment through its ReplicaSet. Replicas also enable horizontal scaling during peak traffic times.
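As a rough sketch, here is how a Deployment with multiple replicas might be created with the official Kubernetes Python client. The image name, labels and namespace are placeholders for illustration; the ReplicaSet that the Deployment creates keeps the requested number of pods running.

```python
from kubernetes import client, config

def create_api_deployment():
    # Load the local kubeconfig; in-cluster config would also work.
    config.load_kube_config()

    container = client.V1Container(
        name="api",
        image="example.com/shop/api:1.4.2",  # hypothetical application image
        ports=[client.V1ContainerPort(container_port=8080)],
    )
    template = client.V1PodTemplateSpec(
        metadata=client.V1ObjectMeta(labels={"app": "api"}),
        spec=client.V1PodSpec(containers=[container]),
    )
    spec = client.V1DeploymentSpec(
        replicas=3,  # the ReplicaSet keeps three identical pods running
        selector=client.V1LabelSelector(match_labels={"app": "api"}),
        template=template,
    )
    deployment = client.V1Deployment(
        api_version="apps/v1",
        kind="Deployment",
        metadata=client.V1ObjectMeta(name="api"),
        spec=spec,
    )
    client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)

if __name__ == "__main__":
    create_api_deployment()
```

Scaling horizontally then amounts to raising the replica count; Kubernetes reconciles the running pods to match it.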

Manage networking as a separate layer

Container networking has matured in recent years, and its biggest advantage is that concerns such as storage and networking can now be treated as layers separate from the application and infrastructure layers. Several capable networking tools have grown up around Kubernetes -- the service meshes Istio and Linkerd and the RPC framework gRPC -- each of which improves load balancing, service discovery and other key network processes.

gRPC uses the HTTP/2 protocol, which multiplexes many calls, including bidirectional streams, over a single connection. In a geographically distributed microservices app, where the number of network calls is bound to be high, tools like gRPC provide plumbing capable of handling those calls at scale.
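For illustration only, the following Python sketch shows the client side of a gRPC bidirectional streaming call. The telemetry_pb2 and telemetry_pb2_grpc modules, the TelemetryStub service and its StreamReadings method are hypothetical stand-ins for code generated from a .proto file; the pattern of passing a request iterator and iterating over responses is the standard grpc-python idiom.

```python
import grpc

# Hypothetical modules generated from a telemetry.proto file.
import telemetry_pb2
import telemetry_pb2_grpc

def reading_stream():
    # Client half of the bidirectional stream: yield one message per reading.
    for value in (1.2, 3.4, 5.6):
        yield telemetry_pb2.Reading(value=value)

def main():
    # gRPC multiplexes many such streams over a single HTTP/2 connection,
    # which keeps connection counts manageable across regions.
    with grpc.insecure_channel("localhost:50051") as channel:
        stub = telemetry_pb2_grpc.TelemetryStub(channel)
        # Server half: iterate over acknowledgements as they arrive.
        for ack in stub.StreamReadings(reading_stream()):
            print("server ack:", ack.sequence)

if __name__ == "__main__":
    main()
```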

Kubernetes also handles networking in a way that is efficient for geographically distributed microservices apps because it supports the sidecar proxy model for networking agents. A sidecar container doesn't exist on its own; it supports a primary container or a group of containers in a Kubernetes pod. A sidecar can collect logs via a logging agent or handle service-to-service communication through a networking agent.
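A minimal sketch of the pattern, again using the Kubernetes Python client: a pod that runs a primary application container alongside a logging sidecar, with a shared volume so the sidecar can ship the files the application writes. The image names and paths are assumptions for illustration.

```python
from kubernetes import client, config

def create_pod_with_logging_sidecar():
    config.load_kube_config()

    # Shared volume so the sidecar can read the files the main container writes.
    log_volume = client.V1Volume(name="logs", empty_dir=client.V1EmptyDirVolumeSource())
    log_mount = client.V1VolumeMount(name="logs", mount_path="/var/log/app")

    app = client.V1Container(
        name="orders",                        # primary application container
        image="example.com/shop/orders:2.0",  # hypothetical image
        volume_mounts=[log_mount],
    )
    sidecar = client.V1Container(
        name="log-agent",                     # ships logs without touching app code
        image="fluent/fluent-bit:2.2",
        volume_mounts=[log_mount],
    )
    pod = client.V1Pod(
        metadata=client.V1ObjectMeta(name="orders", labels={"app": "orders"}),
        spec=client.V1PodSpec(containers=[app, sidecar], volumes=[log_volume]),
    )
    client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)

if __name__ == "__main__":
    create_pod_with_logging_sidecar()
```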

In a distributed microservices app, the volume of service-to-service communication is vast and complex. Rather than keep all the networking logic in the application code or the application containers, it makes sense to separate it out into sidecar containers. That separation lets teams manage the networking logic apart from the application and even reuse it to save time later.

Latency-aware data storage

In geographically distributed apps, how you manage storage directly affects application performance. How quickly an application can process requests depends on how fast it can reach remotely stored data and on the data transfer rates between locations.

Enterprise applications will typically house tens of thousands, if not millions, of data points in systems like SAP ERP, Oracle E-Business Suite and others. For applications to access and consume these large data sets, it takes more than routing a request to the right database. The system must be configured to handle delays and microfailures along the way.
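One common way to absorb those delays and microfailures is to wrap remote reads in retries with exponential backoff and jitter. The sketch below is generic Python, not tied to any particular database client; TransientError stands in for whatever timeout or connection exceptions the real driver raises.

```python
import random
import time

class TransientError(Exception):
    """Stand-in for timeouts and other recoverable microfailures."""

def fetch_with_retries(fetch, max_attempts=5, base_delay=0.2):
    # Retry a remote read with exponential backoff plus jitter so that brief
    # network hiccups don't surface as application errors.
    for attempt in range(1, max_attempts + 1):
        try:
            return fetch()
        except TransientError:
            if attempt == max_attempts:
                raise  # give up after the final attempt
            delay = base_delay * (2 ** (attempt - 1)) + random.uniform(0, 0.1)
            time.sleep(delay)

# Usage: fetch_with_retries(lambda: remote_db.read("orders/42"))  # hypothetical call
```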

Applications designed for eventual data consistency work well when they fetch data from remote locations. In this model, data is fetched in chunks, and each chunk can load on the front-end device without waiting for the rest of the data. The data can be reconciled with the back-end database later; the focus remains on delivering a responsive UI despite large data sets.
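The chunked-fetch pattern can be as simple as a generator that yields one page of records at a time so the front end can render each page as it arrives. In this sketch, fetch_page is a hypothetical remote call; reconciling with the back-end database happens later, out of band.

```python
def fetch_in_chunks(fetch_page, page_size=500):
    # Yield records page by page so the caller can render each chunk
    # instead of waiting for the full data set.
    offset = 0
    while True:
        page = fetch_page(offset=offset, limit=page_size)  # hypothetical remote call
        if not page:
            return
        yield from page
        offset += page_size

def render(record):
    print("showing", record)  # stand-in for updating the UI

# Usage sketch: the UI fills in progressively, and a later sync can reconcile
# any records that changed on the back end while the fetch was in flight.
# for record in fetch_in_chunks(remote_api.list_orders):
#     render(record)
```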

In mobile applications, data caching is an important part of handling data access. It is inefficient for an app to go back and forth to request data, some of which may be duplicated. It's better for the app to store parts of the data on the client, which requires a consistent caching strategy that also supports offline functionality. Fewer requests speed up data transfer and deliver a better user experience.
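A minimal sketch of that idea: a small client-side cache that serves local data while it is fresh, or whenever the device is offline, and falls back to the network otherwise. The fetch_remote callable and the TTL value are assumptions for illustration.

```python
import time

class ClientCache:
    # Time-aware cache: serve local data when it is fresh (or when the device
    # is offline) and only go to the network otherwise.
    def __init__(self, fetch_remote, ttl_seconds=300):
        self._fetch_remote = fetch_remote   # hypothetical network call
        self._ttl = ttl_seconds
        self._store = {}                    # key -> (value, fetched_at)

    def get(self, key, offline=False):
        entry = self._store.get(key)
        if entry is not None:
            value, fetched_at = entry
            if offline or time.time() - fetched_at < self._ttl:
                return value                # avoid a round trip
        value = self._fetch_remote(key)
        self._store[key] = (value, time.time())
        return value
```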

Geographically distributed applications that use microservices are common in today's cloud-native environments. Container technology aids this architectural setup and helps users manage aspects of data delivery, such as networking and storage, separate from the application code itself. With these advances, the management of geographically distributed applications built on microservices is now easier than ever and should be part of an effective software delivery strategy.
