In this article

Here we look at the current dilemma with data flows, who it impacts and what can be done about it. 

What problems are organizations facing today?

Data constantly expands, not only in size but in the complexity of its sources and destinations. Of the many reasons for this increasing complexity, the most prevalent relate to the end user and the application landscape:

  • The global pandemic made remote working a necessity – end users now need high-bandwidth remote access to critical systems.
  • Applications have become decentralized, existing across multiple public and private cloud infrastructures, as well as on-prem.

These trends have proved to be the catalyst for many IT and security initiatives, including data center migrations as well as a zero trust approach to security, which incorporates segmentation, secure remote access, and SASE (Secure Access Service Edge), to name a few. 

Of course, the global skills shortage has also led to a clamor for AI/ML and the automation benefits they bring, particularly cost management and speed to resolution. Yet the key to harnessing all of this is good data – or, more precisely, visibility, validation and control of data – because it enables speed of execution while simultaneously reducing risk.

The problem is that most businesses are struggling to get a view of the application workflows and network traffic paths that exist across their environment. Furthermore, organizations are uncertain about the validity of the data in their CMDB (Configuration Management Database) – the very data that is supposed to be a consolidated source of truth.

Who do these challenges impact and how?

The burden of these challenges does not fall on any one team. Rather, we are seeing a convergence of requirements across infrastructure and security teams, with the operations team smack in the middle, handling the day-to-day inner workings of the organization.

For these teams, good data should support the following requirements:

  • Infrastructure dependency mapping – understanding what traffic paths are in place and creating a hierarchy of infrastructure (illustrated in the sketch after this list) are critical to a host of requirements, including:
    • Quickly establishing the cause of end-user performance issues.
    • Infrastructure lifecycle management, addressing technical debt and implementation of change.
    • Internal cross charging and cost allocation.
  • Application dependency mapping – more than just observability of application workflows, ADM is the identification of valid workflows and is essential to:
    • The identification and alerting of anomalous traffic flows synonymous with an attack or threat actor.
    • Control and limitation of the blast radius associated with a breach.
    • Policy creation and deviation monitoring.
  • Data center migration – accelerating the pace at which organizations can realize business enablement and cost management associated with data center migrations requires:
    • A mechanism to ensure applications can be migrated with minimal disruption and risk.
    • Legacy applications to be identified and decommissioned with certainty of impact.
  • CMDB validation – many organizational necessities are reliant on the integrity of an organization's CMDB data, including:
    • Reducing risk of a compromise.
    • Enabling seamless and efficient business change.
  • Acceleration of zero trust adoption – key tenets of a zero trust strategy, such as network segmentation and microsegmentation, require clear visibility of traffic flows as a prerequisite to effective enforcement.
  • Change management – effective change management processes require a means to simulate the impact of a given change before administering it in real time.
  • Cyber resilience – knowing what normal looks like and having control of your data flows are key first steps in hardening your environment to enable early detection, as well as creating a platform for recovery when a cyber incident occurs.
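
As a simple illustration of the dependency-mapping idea above, the sketch below groups flow records into a per-host dependency map. It is a minimal sketch only: the record format, hostnames and ports are hypothetical placeholders, and real telemetry (NetFlow/IPFIX, cloud flow logs, firewall logs) would first need to be parsed into this shape.

```python
# Minimal sketch: deriving a dependency map from already-parsed flow records.
# All hostnames, ports and the record format below are hypothetical examples.
from collections import defaultdict

# Hypothetical flow records: (source host, destination host, destination port)
flows = [
    ("web-01", "app-01", 8443),
    ("web-02", "app-01", 8443),
    ("app-01", "db-01", 5432),
    ("app-01", "legacy-batch", 21),   # an unexpected path worth investigating
]

def build_dependency_map(flow_records):
    """Group the destinations (and ports) observed for each source host."""
    dependencies = defaultdict(set)
    for src, dst, port in flow_records:
        dependencies[src].add((dst, port))
    return dependencies

for src, targets in sorted(build_dependency_map(flows).items()):
    print(f"{src} depends on: {sorted(targets)}")
```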

Most modern-day initiatives are only as good as the foundation they stand on. That is why it is imperative that, at the most basic level, there is adequate visibility, validation and control of data. Optimizing these three is the prudent first step that lets organizations move forward on a sound footing and achieve security goals such as governance and compliance. It also promotes faster, more effective business change and reduces the risk and impact associated with both new and legacy systems.

Aren't application dependency mapping and observability solutions widely available already?

While terms like ADM and observability can be useful at a top level, they can also lead to misunderstanding when their definitions are open to interpretation. At WWT, we think of this subject in terms of having the ability to understand what normal should look like and then mapping the pieces accordingly. This enables effective anomaly detection and east-west traffic flow validation.
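
To make that concrete, here is a minimal sketch of east-west flow validation: observed flows are compared against a declared baseline of what normal should look like. The hostnames, ports and baseline below are invented for illustration; in practice, the baseline would come from validated application dependency data.

```python
# Minimal sketch: validating observed east-west flows against a declared baseline.
# All hostnames, ports and the baseline itself are hypothetical examples.

# What "normal" should look like: the flows the application owners expect.
baseline = {
    ("web-01", "app-01", 8443),
    ("app-01", "db-01", 5432),
    ("batch-01", "db-01", 5432),
}

# What the network actually reports seeing.
observed = {
    ("web-01", "app-01", 8443),
    ("app-01", "db-01", 5432),
    ("app-01", "web-01", 3389),   # east-west RDP that nobody declared
}

anomalies = observed - baseline   # deviations from normal, worth alerting on
unused = baseline - observed      # declared flows never seen, candidates to retire

for src, dst, port in sorted(anomalies):
    print(f"ALERT: unexpected flow {src} -> {dst}:{port}")
for src, dst, port in sorted(unused):
    print(f"REVIEW: declared but unobserved flow {src} -> {dst}:{port}")
```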

How can these challenges be addressed most effectively? 

One of the key challenges for businesses today is the sheer volume of investment they have already made and the fragmentation that persists. Research suggests companies have, on average, 45-75 security tools alone, an overload that can lead to process overlap and gaps. Therefore, WWT offers "Lab-as-a-Service" as part of our solution offering for many of our clients.

Within WWT's Advanced Technology Center (ATC), we work with customers to develop an environment that replicates their ecosystem to test products and solutions safely. This also serves as a means of eliminating process overlap, stress-testing security protection mechanisms as a whole and developing cohesive infrastructures.

A final note

A cornerstone of WWT's philosophy is sustainability, which fuels our drive to help customers leverage their existing investments and make the most of their current infrastructure. For this reason, we start by ingesting and aggregating existing telemetry, a step that is imperative to providing context and turning information into intelligence.

By harnessing these data sources, we can combine disparate data sets and determine if the relationships that exist are relevant and necessary. From there, it's possible to create and continuously refine policy so that the data remains valid on an ongoing basis.
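
As a rough sketch of what that cross-referencing can look like, the example below joins observed flows against a simplified CMDB extract, surfaces gaps and stale records, and keeps the relationships that check out as candidates for policy. All asset names, owners and field names are hypothetical.

```python
# Minimal sketch: cross-referencing flow telemetry with a CMDB extract.
# Asset names, owners and fields are hypothetical examples, not a product integration.

# Flows aggregated from existing telemetry sources: (source, destination, port).
observed_flows = {
    ("app-01", "db-01", 5432),
    ("app-01", "db-legacy", 1521),
}

# A simplified CMDB extract: asset name -> recorded attributes.
cmdb = {
    "app-01":    {"owner": "payments", "status": "live"},
    "db-01":     {"owner": "payments", "status": "live"},
    "db-legacy": {"owner": "unknown",  "status": "retired"},
}

policy_candidates = []
for src, dst, port in sorted(observed_flows):
    src_rec, dst_rec = cmdb.get(src), cmdb.get(dst)
    if src_rec is None or dst_rec is None:
        print(f"CMDB GAP: {src} -> {dst}:{port} involves an asset missing from the CMDB")
    elif dst_rec["status"] != "live":
        print(f"STALE DATA: traffic still reaching {dst}, which the CMDB marks as retired")
    else:
        policy_candidates.append((src, dst, port))   # relationship looks relevant and necessary

print("Candidate allow-list for policy refinement:", policy_candidates)
```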

Ready to discover how we can help you address the basics of data visibility, validation and control?
Contact us today!