This article was written and contributed by Akamai.

Securing the data center used to be simple, at least on paper: Build a strong perimeter, deploy firewalls at the north-south boundary, and control what enters and exits the network.

That model worked when applications were monolithic, data stayed in one place, and artificial intelligence (AI) was a future ambition rather than an infrastructure reality.

But those days are over. Today, AI-powered data centers, high-performance machine learning workloads, and cloud native architectures generate more internal traffic than ever before. More than 76% of data center traffic now flows east to west, moving between GPUs, endpoints, APIs, datasets, and internal services. It no longer crosses the perimeter.

That shift has exposed serious security risks.

The rise of east-west traffic in AI infrastructure

In modern data centers, traffic patterns have changed significantly. Distributed AI systems rely on real-time data exchange among training nodes, orchestration tools, and storage systems. These connections occur deep inside the network between workloads rather than across networks. 

For example, traffic may go from Server 1 to Server 2, which may sit in the same server rack. That traffic never leaves the rack, so it never traverses the network to reach the firewall (or a legacy segmentation solution).

A single AI model training session can generate terabytes of east-west traffic as datasets move across GPU clusters, container environments, and scalable compute fabrics. The tolerance for latency is extremely low. Traditional perimeter tools weren't designed for this level of throughput or complexity.
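The distinction above can be expressed in a few lines. The sketch below classifies a flow as east-west when both endpoints sit inside the data center fabric, and north-south otherwise; the subnet ranges are hypothetical placeholders, since a real environment would pull them from its IPAM or inventory system.

```python
from ipaddress import ip_address, ip_network

# Hypothetical internal subnets for a data center fabric; a real
# environment would load these from an IPAM or CMDB source.
INTERNAL_NETS = [ip_network("10.0.0.0/8"), ip_network("172.16.0.0/12")]

def is_internal(ip: str) -> bool:
    addr = ip_address(ip)
    return any(addr in net for net in INTERNAL_NETS)

def classify_flow(src_ip: str, dst_ip: str) -> str:
    """East-west: both endpoints inside the fabric; north-south otherwise."""
    if is_internal(src_ip) and is_internal(dst_ip):
        return "east-west"
    return "north-south"

# Two GPU nodes in the same rack never cross the perimeter firewall:
print(classify_flow("10.1.4.20", "10.1.4.21"))   # east-west
# A client on the internet reaching an internal API does:
print(classify_flow("203.0.113.7", "10.1.0.5"))  # north-south
```

A perimeter firewall only ever sees the second kind of flow, which is exactly why the first kind goes uninspected.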

More significantly, most perimeter tools can't see this traffic at all. As a result, it's important to ask these questions:

  • Do our current firewalls protect our AI pipelines, or just our front door?
  • Is our cybersecurity strategy built for east-west traffic, or is it still anchored in north-south assumptions?

From lobby security to departmental access: A new metaphor for segmentation

Imagine your data center as a large, high-security office building.

North-south traffic is like visitors entering and exiting the building. They pass through physical security, badge readers, or reception. Access is logged and monitored. This is where traditional firewalls operate, screening what comes in from the outside.

But once inside, visitors move freely throughout the building. Some enter the human resources offices; others head to finance or engineering. Yet not everyone should have access to payroll systems or sensitive information.

That internal movement, which represents workload-to-workload communication, is east-west traffic. And in most data centers, it is barely secured, if at all.

Here is what that means for your AI infrastructure:

  • Lateral movement becomes easy. Threats can quickly move from one function to another without detection.
  • Insider threats go unnoticed. Internal access is often assumed to be trusted, increasing the chance of tampering or data leaks.
  • AI pipelines and datasets remain exposed. Sensitive data and metadata used in training models aren't monitored with the same rigor as external traffic.
  • Compliance scope expands. Without proper segmentation, proving compliance with frameworks like National Institute of Standards and Technology (NIST) or Health Insurance Portability and Accountability Act (HIPAA) becomes difficult and expensive.

Even with endpoint agents, virtual firewalls, or traditional network security measures, internal traffic among services often bypasses deep inspection because of performance trade-offs.
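Segmentation closes that gap by making internal access explicit rather than assumed. A minimal sketch of the idea, using hypothetical workload labels (the label names and ports here are illustrative, not from any particular product): workloads may only talk to each other if an allow-list rule exists, and everything else is denied by default — the badge that opens the lobby does not open the payroll office.

```python
# Illustrative label-based segmentation policy (hypothetical labels/ports).
# Default deny: only explicitly allowed workload pairs may communicate.
ALLOWED = {
    ("web", "app", 8080),
    ("app", "db", 5432),
    ("trainer", "dataset-store", 443),  # AI pipeline: GPU nodes -> datasets
}

def is_allowed(src_label: str, dst_label: str, port: int) -> bool:
    """Return True only when an explicit allow rule exists."""
    return (src_label, dst_label, port) in ALLOWED

print(is_allowed("web", "app", 8080))  # True
print(is_allowed("web", "db", 5432))   # False: no lateral path to the database
```

Under this model, a compromised web workload cannot reach the database or the training datasets directly, which is precisely the lateral movement the bullet list above warns about.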

Rethinking the architecture: Secure where the traffic lives

If the majority of your data center traffic flows east to west, your security controls must be positioned accordingly.

That is exactly what the integration of Akamai Guardicore Segmentation with the Aruba CX 10000 smart switch, powered by the AMD Pensando data processing unit (DPU), is designed to deliver.

This approach embeds microsegmentation enforcement directly into the data center switch at the top of every rack. As traffic enters the network fabric, it is inspected, validated, and matched to policy in real time. There's no need to redirect traffic to centralized firewalls, hairpin traffic, or consume resources on software agents.
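Conceptually, enforcement at the top of the rack looks something like the sketch below: each flow is mapped to workload labels and checked against policy where it enters the fabric, and every decision is logged for visibility. The IP-to-label inventory and the policy entries are hypothetical stand-ins for what a segmentation manager would push down to each switch; this is an illustration of the pattern, not the product's implementation.

```python
import logging

logging.basicConfig(level=logging.INFO, format="%(message)s")

# Hypothetical workload inventory and policy; in practice these would be
# synced from the segmentation manager to each top-of-rack switch.
LABELS = {"10.1.4.20": "trainer", "10.1.8.5": "dataset-store", "10.1.9.9": "payroll"}
POLICY = {("trainer", "dataset-store", 443)}

def enforce(src_ip: str, dst_ip: str, port: int) -> bool:
    """Evaluate a flow where it enters the fabric: no hairpinning to a
    central firewall, default deny, and every verdict is logged."""
    rule = (LABELS.get(src_ip, "unknown"), LABELS.get(dst_ip, "unknown"), port)
    verdict = "PERMIT" if rule in POLICY else "DENY"
    logging.info("%s %s -> %s:%s", verdict, src_ip, dst_ip, port)
    return verdict == "PERMIT"

enforce("10.1.4.20", "10.1.8.5", 443)  # permitted: AI pipeline traffic
enforce("10.1.4.20", "10.1.9.9", 443)  # denied: lateral move toward payroll
```

Because the check happens in the switch's forwarding path rather than in a distant appliance, the latency-sensitive GPU-to-GPU and GPU-to-storage traffic described earlier is inspected without a detour.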

With this model, you can gain:

  • High-speed, in-fabric enforcement built for both modernized applications and AI workloads
  • Automation of policy creation and lifecycle management
  • End-to-end visibility across the AI infrastructure ecosystem
  • Reduced attack surface without sacrificing performance

It's a smarter and more scalable way to defend against unauthorized access, cyberattacks, and supply chain vulnerabilities. This model is especially critical in hyperscale and AI-driven environments.

More questions that every data center operator should ask

  • How do we monitor and control east-west movement between workloads today?
  • Are our existing security measures optimized for AI, or are they built for yesterday's data center?
  • Can we detect security incidents across dynamic containers, APIs, and model training environments?
  • What happens to sensitive data once it moves past the perimeter?