Global survey explores networking needs for AI era

Data center experts predict at least a 6X increase in DCI bandwidth demand over the next five years, with 43% of new data center facilities expected to be dedicated to AI workloads.

The rapid growth of AI workloads is driving a major transformation in data center network infrastructure, with global data center experts anticipating at least a sixfold increase in interconnect bandwidth needs over the next five years, according to a study commissioned by Ciena.

The survey, conducted in partnership with Censuswide, queried more than 1,300 data center decision makers across 13 countries. More than half (53%) of respondents believe AI workloads will place the biggest demand on data center interconnect (DCI) infrastructure over the next 2-3 years, surpassing cloud computing (51%) and big data analytics (44%).

To meet surging AI demands, 43% of new data center facilities are expected to be dedicated to AI workloads. With AI model training and inference requiring unprecedented data movement, data center experts predict a massive leap in bandwidth needs. In addition, when asked about the fiber-optic capacity required for DCI, 87% of participants said they will need 800 Gb/s or higher per wavelength.

"AI workloads are reshaping the entire data center landscape, from infrastructure builds to bandwidth demand," said Jürgen Hatheier, Chief Technology Officer, International, Ciena. "Historically, network traffic has grown at a rate of 20-30% per year. AI is set to accelerate this growth significantly, meaning operators are rethinking their architectures and planning for how they can meet this demand sustainably.”

Creating More Sustainable AI-Driven Networks

Survey respondents confirm there is a growing opportunity for pluggable optics to support bandwidth demands and address power and space challenges. According to the survey, 98% of data center experts believe pluggable optics are important for reducing power consumption and the physical footprint of their network infrastructure.

Distributed Computing

The survey found that, as requirements for AI compute continue to increase, the training of Large Language Models (LLMs) will become more distributed across different AI data centers. According to the survey, 81% of respondents believe LLM training will take place across some level of distributed data center facilities, which will need to be interconnected using DCI solutions. When asked about the key factors shaping where AI inference will be deployed, respondents ranked the following priorities:

· AI resource utilization over time (63%)

· Reducing latency by placing inference compute closer to users at the edge (56%)

· Data sovereignty requirements (54%)

· Offering strategic locations for key customers (54%)

Rather than deploying dark fiber, the majority (67%) of respondents expect to use Managed Optical Fiber Networks (MOFN), which utilize carrier-operated high-capacity networks for long-haul data center connectivity.

"The AI revolution is not just about compute—it’s about connectivity," added Hatheier. "Without the right network foundation, AI’s full potential can’t be realized. Operators must ensure their DCI infrastructure is ready for a future where AI-driven traffic dominates."
