In This Article

There is no doubt that the cloud has brought forth one of the most significant platform shifts in the history of computing. To date, this transformation has impacted hundreds of billions of dollars in IT spend and is projected to add roughly $100 billion in annual spend going forward. It is safe to say we are in the early innings of this paradigm. The shift has been driven by a powerful value proposition: immediately available infrastructure that lets a business scale. We all understand this, but how do we realize its maximum potential? Let's take a look and embark on a journey called modernization.

As IT organizations continue to evolve, they have come to realize there are two sides to the same coin: moving to the cloud, and optimizing applications so that on-demand infrastructure and managed cloud services can be consumed effectively. Neglecting the second side is a common, and counterintuitive, oversight given prevailing narratives around cloud vs. on-prem data centers. However, it's clear that when you factor in the impact on perceived value in addition to near-term savings, companies that continue to scale their service offerings can and should justify the investment needed to keep cloud operating expenses low for the long haul.


The approach to creating value is nine-tenths of the law 

Every company in the world already has valuable data and functionality housed within its systems. Capitalizing on this value, however, means liberating it from silos and making it interoperable and reusable in different contexts, including combining it with assets from partners and other third parties to promote innovation. One of the barriers to entry is the elevated level of privacy concern. As part of an ongoing strategy, organizations expect services to integrate fluidly with retailers' apps, just as retailers want their customers to have easy in-app access and preferred service choices, but not at the cost of weakening their security posture.

The application programming interface (API) has been elevated from a development component to a business model driver. Organizations are beginning to see the power of sharing and reusing core assets that can be monetized through modular, container-based services aligned with business needs. The result is an extended reach for existing service offerings that unlocks new revenue streams. By extending selective services to outside partners and developers, APIs open even more opportunities, letting a business focus on maximizing its strengths while relying on other ecosystem participants for complementary technologies and to fill competitive or go-to-market gaps.


How to get started on modernizing the application infrastructure

Truly transforming how business and IT work together to deliver function through software requires organizational change that goes far beyond a single group. I have found the best approach is to identify initiatives along four dimensions:

  1. Business function
  2. Project
  3. Budget
  4. Key stakeholder

With the above in mind, try to consolidate or compartmentalize these initiatives into high-level objectives, identifying how each interacts with the others. The common theme here is to know your business.

Knowing your business
Knowing your business will allow you to organize your technology.

When you break that monolithic application into smaller, more manageable subsets of the greater application, you gain additional visibility when aligning software applications with core business objectives. As an example, consider a taxi-hailing web application like the one below. The core of the architecture is business logic. Despite being modular, the application is packaged and deployed as a monolith.

An application running as a monolith

Now envision decomposing the system into a set of simpler container-based web applications (such as one for passengers and one for drivers), and look at how each interacts with a service such as billing or payments. This makes it easier to deploy distinct experiences for specific users, devices, or specialized use cases, because each change is isolated to a single piece of software that addresses a business need.

Each backend service exposes a gRPC or REST-based API for north-south traffic (client to service, e.g., a user application or developer calling the passenger service) and east-west traffic (service to service, e.g., the driver service calling billing). This also enables a third-party ecosystem to innovate through an API gateway acting as a proxy that provides role-based access control and security.
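To make the decomposition concrete, here is a minimal sketch of two such services communicating east-west over REST. The service names, ports, endpoint, and fare formula are hypothetical illustrations, not the article's actual application:

```python
# Minimal sketch: a "passenger" service delegating fare calculation to a
# separate "billing" service over REST. Names and endpoints are hypothetical.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class BillingHandler(BaseHTTPRequestHandler):
    """East-west service: computes a fare for a trip."""
    def do_GET(self):
        # e.g. GET /fare?miles=5 -> {"fare": 12.5}
        miles = float(self.path.split("=")[1])
        body = json.dumps({"fare": round(2.5 * miles, 2)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

def start_billing(port):
    """Run the billing service in a background thread."""
    server = HTTPServer(("127.0.0.1", port), BillingHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

def request_ride(miles, billing_port):
    """North-south entry point: the passenger service calls billing over REST."""
    with urlopen(f"http://127.0.0.1:{billing_port}/fare?miles={miles}") as resp:
        return json.load(resp)

server = start_billing(8765)
print(request_ride(5, 8765))  # {'fare': 12.5}
server.shutdown()
```

Because each service owns its own process and API surface, the billing logic can be redeployed or scaled without touching the passenger-facing code, which is the isolation benefit described above.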

An application running in self-contained node services using gRPC or REST to communicate


Onboard that first non-containerized service subset of the web application

The journey to Day 2 production operations is not an easy one. The most common challenge organizations face when adopting a container-based strategy is overcoming the complexity of onboarding an application to a Kubernetes container platform (coupled with security concerns and scaling to meet the broader needs of the business).

An upcoming WWT lab will demonstrate the onboarding of a small application onto a container management platform such as Red Hat OpenShift.

It leverages an open-source, built-in utility called "Source-to-Image" to package a simple application stored in a git repository, translate all required components into a container image, and deploy it into a central OpenShift cluster namespace.

Red Hat OCP UI - Create an application from code

Image1: The feature will automatically detect the language and framework used in the software application and set up a build pipeline to create a container image containing it. The image will be stored in the built-in image registry provided by OpenShift.

Red Hat OCP UI - refer to git repository

Image2: The container image is stored in the built-in image registry and deployed to a pod running on a cluster node.

Red Hat OCP - compile (if instructed) and containerize source

Image3: Lastly, the deployed application in OpenShift will be generally available.

Python application running as a container instance on Kubernetes


Role-based access and protecting the core business application assets  

Now that we've built out our services-based application in a native Kubernetes environment, how do we change our organizational process and culture to enable developers and partners to access each service in a manner that promotes innovation and collaboration? By implementing a strong access and security posture, you can build a sandbox with guardrails for developers and partners, restricting access to only the services assigned through policy.
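Reduced to its essence, that guardrail is a policy table the gateway consults before routing a call: which roles may reach which services. The roles, services, and policy below are hypothetical examples, not a specific product's configuration:

```python
# Minimal sketch of gateway-style role-based access control.
# Roles, service names, and the policy table are hypothetical examples.
POLICY = {
    "partner-dev": {"passenger", "driver"},              # sandboxed partners
    "internal-dev": {"passenger", "driver", "billing"},  # full-access developers
    "auditor": {"billing"},                              # read-side compliance role
}

def authorize(role, service):
    """Allow the call only if policy grants the role access to the service."""
    return service in POLICY.get(role, set())

print(authorize("partner-dev", "billing"))   # False: billing is off-limits
print(authorize("internal-dev", "billing"))  # True
```

An unknown role gets an empty grant set, so the default is deny, which is the posture you want for a partner sandbox.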

The lab environments below demonstrate the industry's shift toward agile methodologies for developing applications at speed.

API Management using GitOps Workflow

This lab focuses on three principal solution areas: API management, API gateway and API security. In the labs below you will work with GitLab as a DevOps platform and execute automated CI/CD pipelines to build, test and deploy a three-service financial application into a native Kubernetes cluster.

Next, we will enable organizations to incorporate security best practices using a declarative CI/CD approach during the early stages of application development, and secure API workloads to manage the API lifecycle.

API Security: Declarative AWAF Policy Lifecycle in a CI/CD Pipeline Lab

In this lab, we'll demonstrate the integration of a declarative AWAF policy in a CI/CD pipeline. The AWAF policy is deployed via AS3 and protects an API workload deployed in Kubernetes by ingesting the OpenAPI 3.0 (Swagger) file describing the API. The GitLab CI/CD pipeline uses modern automation tools such as Terraform, Ansible and F5 AS3 to deploy and configure the application workloads.


The financial services journey: Making it real in the Advanced Technology Center 

Financial technology applications have experienced a rapid rise in popularity in recent years. The appeal of these services is clear, as managing consumer and business finances has often become time-consuming and even confusing. This has resulted in the proliferation of fintech startups innovating in areas such as budgeting, payments, investing and lending.

Fintech applications provide solutions designed to simplify, resulting in fast, easy access to the data necessary to make good financial decisions.

Open Banking components highlighted

The up-and-coming lab below really brings this solution to life, using OAuth 2.0 and OpenID Connect to grant clients delegated access to back-end services. The beauty of this lab is that it can demonstrate delegation of services to differing organizational owners, resulting in role-based access control for each service aligned with the business need.

API Security using OAuth 2.0 & OpenID Connect workflow

This lab addresses the use case of delegated access to APIs by a third-party application using OAuth 2.0 and OpenID Connect. The backend application is deployed via an automated CI/CD pipeline, with the NGINX API Gateway acting as a resource server, validating the access tokens presented by the third-party application and granting access to the third-party provider (TPP) depending on the scope that was requested.
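In simplified form, the resource-server check amounts to verifying the token is still valid and comparing its granted scopes to the scope the API requires. The sketch below uses hypothetical claim and scope names and is not the NGINX or lab implementation; a real gateway would also verify the token's signature, issuer, and audience:

```python
# Simplified sketch of an OAuth 2.0 resource-server scope check.
# Token structure and scope names are hypothetical illustrations.
import time

def validate_token(token, required_scope):
    """Grant access only if the token is unexpired and carries the scope."""
    if token.get("exp", 0) < time.time():
        return False  # token expired: reject
    # OAuth 2.0 scopes are conventionally a space-delimited string
    return required_scope in token.get("scope", "").split()

access_token = {
    "sub": "myPISP",                     # the third-party application
    "scope": "money_transfer accounts",  # scopes granted by the resource owner
    "exp": time.time() + 3600,           # expires in one hour
}

print(validate_token(access_token, "money_transfer"))  # True
print(validate_token(access_token, "payments_admin"))  # False: never granted
```

This is the enforcement point that lets each resource owner grant only the scopes (such as `money_transfer`) that the TPP actually requested.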

myPISP (Payment Initiation Service Provider) banking application

Image1: A client can launch a third-party application called myPISP (Payment Initiation Service Provider, in the context of Open Banking for FinServ), which invokes an API transaction for a money transfer service.

Redirect to OAuth server to grant access log in

Image2: The resource owner of the service receives a redirect from the OAuth server requesting that access be granted.

Each service can have an owner granting API level access

Image3: The resource owner authorizes the request to grant access to the specific service; in the context of myPISP, it could be a money_transfer service. The IAM OAuth server could be an F5 access policy and/or a PingFederate policy, for example. The ATC lab environment will look at and compare both solutions to determine alignment with standards and business applications.

The closed feedback loop of the transaction provided to the owner

Image4: Successful API call displaying the transaction details.  

In conclusion, it's important to embrace a shift in supply and demand, one in which resource-constrained strategies give way to infinitely replicable digital assets and new opportunities around economies of scale. For example, rather than limiting much of their business to physical branches, banks across the world want to make their services available to customers wherever they are by implementing this approach to managed services.


Reference: Chris Richardson (Eventuate, Inc.), "Introduction to Microservices," May 19, 2015, https://www.nginx.com/blog/introduction-to-microservices/