
In modern application architectures, evolving the services that support the underlying infrastructure is critical to delivering a positive end-user experience.

Achieve flexibility with service-oriented architecture

As applications evolve, more organizations are migrating from a monolithic development approach to a more flexible, component-based service-oriented architecture. Doing so provides the visibility to align functional areas and the flexibility to react to code changes throughout the software development life cycle (SDLC).

Our customers are looking for ways to simplify their transition to modern application architectures. In this first part of a series of articles, we'll discuss one of the most common solutions: providing a single, consistent entry point for multiple RESTful (representational state transfer) APIs, regardless of how they are implemented or deployed at the back end.
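To make that concrete, the configuration below is a minimal sketch of a single entry point: one public hostname routing requests to two back-end API services by URI prefix. The upstream names, addresses, paths and certificate locations are assumptions for illustration only.

# Minimal API gateway sketch: one public entry point routing to two
# back-end API services. Names, hosts and URI prefixes are illustrative.
upstream inventory_api {
    server 10.0.0.11:8080;
    server 10.0.0.12:8080;
}

upstream pricing_api {
    server 10.0.1.21:8080;
    server 10.0.1.22:8080;
}

server {
    listen 443 ssl;
    server_name api.example.com;

    ssl_certificate     /etc/ssl/certs/api.example.com.crt;
    ssl_certificate_key /etc/ssl/private/api.example.com.key;

    # Clients see a single host; NGINX routes by URI prefix.
    location /api/inventory/ {
        proxy_pass http://inventory_api;
    }

    location /api/pricing/ {
        proxy_pass http://pricing_api;
    }

    # Anything outside the published API surface is rejected.
    location / {
        return 404;
    }
}

In practice this tier is also where concerns such as TLS termination, authentication and rate limiting are typically applied, keeping them out of the individual services.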

Handling API requests across the network

Today, application servers are load balanced globally across multiple data centers using L4-L7 traffic controllers that enable GSLB (global server load balancing); one of the more common GSLB technologies is F5 BIG-IP DNS, previously known as GTM. A challenge commonly associated with microservice architectures is the exponential increase in the number of upstream API requests crossing the network.

This challenge is a result of the transition from a monolithic application, which resides on many physical or virtual servers, to a microservice architecture that breaks that monolith into manageable parts. How can we orchestrate the evolution of these services and migrate traffic without disrupting the end-user experience?
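One common answer with NGINX is weighted traffic splitting: route a small share of requests to the new service while the monolith continues to handle the rest, then raise the share as confidence grows. The sketch below uses the split_clients directive; the upstream names, hostnames and the 10 percent ratio are assumptions for illustration.

# Sketch of a gradual traffic migration: send a small percentage of
# requests to the new microservice while the monolith serves the rest.
upstream monolith {
    server legacy.example.internal:8080;
}

upstream orders_service {
    server orders-v2.example.internal:8080;
}

# Hash the client address into two buckets: 10% "new", 90% "legacy".
split_clients "${remote_addr}" $orders_backend {
    10%     orders_service;
    *       monolith;
}

server {
    listen 80;

    location /api/orders/ {
        proxy_pass http://$orders_backend;
    }
}

Because the split is handled at the edge proxy, the percentage can be adjusted and rolled back without any change to clients or to the services themselves.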

Enter NGINX Plus

Many organizations use NGINX Plus as an API Gateway to maintain or improve resiliency and overall system performance. While presenting end users with a single, funneled endpoint into the system, on the back end it can fill multiple roles when managing RESTful API requests.

In this use case, NGINX Plus acts as an inbound/outbound proxy consolidated into a single tier. Using an edge proxy not only simplifies the transition to a service-oriented architecture but also enables DevOps teams to react to new requirements by quickly adding features based on request criteria.

Leveraging NGINX Plus as an API Gateway can provide value in three core areas when enabling microservices as part of your application.

  • Service Agility or Assurance: Intelligently redirect or load balance traffic across data centers based on specific actions or criteria. Health checks identify a healthy microservice tier in each data center based on metrics such as HTTP error rate, error response codes or even the health of the API response itself. You can also use regular expressions to distinguish critical errors, which are returned to the end user, from non-critical ones, which are tracked for reference (see the configuration sketch after this list).
  • Auto Retry: Retry a request against another instance of the targeted service when the response indicates a failure, such as an HTTP 500. A developer can set the retry limits and timeout, then either send an error response downstream or spin up a new service instance.
  • Content Caching: Store content close to the user, thereby reducing the number of requests reaching the back end in each region and the overall retrieval time, ultimately improving the user experience.
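As a rough illustration of how these three capabilities look in practice, the sketch below combines active health checks with a response-body match, automatic retries against the next upstream server, and response caching in one NGINX Plus configuration. The upstream names, health-check endpoint, thresholds and cache settings are assumptions, not recommendations.

# Sketch combining the three features above (http context).
# Content caching: cache responses on the edge proxy.
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=api_cache:10m
                 max_size=1g inactive=10m;

upstream catalog_api {
    zone catalog_api 64k;
    server catalog-a.example.internal:8080;
    server catalog-b.example.internal:8080;
}

# Service assurance: a server is healthy only if /healthz returns 200
# and the body matches the regular expression "ok".
match catalog_ok {
    status 200;
    body ~ "ok";
}

server {
    listen 443 ssl;
    server_name api.example.com;
    ssl_certificate     /etc/ssl/certs/api.example.com.crt;
    ssl_certificate_key /etc/ssl/private/api.example.com.key;

    location /api/catalog/ {
        proxy_pass http://catalog_api;

        # Active health checks (NGINX Plus) keep traffic on healthy instances.
        health_check uri=/healthz interval=5s fails=2 passes=2 match=catalog_ok;

        # Auto retry: on an error, timeout or HTTP 500, try the next
        # upstream server, within the limits set below.
        proxy_next_upstream error timeout http_500;
        proxy_next_upstream_tries 2;
        proxy_next_upstream_timeout 5s;

        # Content caching: serve repeat requests from the local cache.
        proxy_cache api_cache;
        proxy_cache_valid 200 1m;
        proxy_cache_use_stale error timeout updating;
    }
}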

These features are just a few of the many capabilities an edge proxy can provide to increase overall application performance.

In conclusion, NGINX Plus is an on-ramp to broader use cases, such as serving as an Ingress Controller for Kubernetes environments, including Amazon Elastic Container Service for Kubernetes (EKS), Azure Kubernetes Service (AKS), Google Kubernetes Engine (GKE), Red Hat OpenShift and others, in preparation for a platform that supports multicloud architectures.

For a deeper dive, refer to the NGINX resources below:

NGINX Cookbook

https://www.nginx.com/resources/library/complete-nginx-cookbook/?utm_campaign=core&utm_medium=ebook&utm_source=WWT&utm_content=partner

NGINX Plus Free Trial

https://www.nginx.com/free-trial-request/?utm_campaign=plus&utm_medium=free-trial&utm_source=WWT&utm_content=partner

NGINX Controller Free Trial

https://www.nginx.com/free-trial-request-nginx-controller/?utm_campaign=controller&utm_medium=free-trial&utm_source=WWT&utm_content=partner

NGINX Blog

https://www.nginx.com/blog/?utm_campaign=core&utm_medium=blog&utm_source=WWT&utm_content=partner
