The explosive growth of AI applications is reshaping data center, edge and cloud networks.
As companies put more AI into production, growth at both the edge and in the data center is creating demands for bandwidth, latency, and architectural flexibility that traditional networks weren't designed to deliver.
The latest data from Omdia finds that all AI traffic, including net new AI applications and AI-enhanced applications, accounted for 39 exabytes of total network traffic in 2024. Non-AI traffic of AI-enhanced applications totaled 131 exabytes, and conventional application traffic accounted for 308 exabytes, according to Omdia research director Brian Washburn.
In 2025, those 39 exabytes of AI traffic will double to 79 exabytes, Omdia expects, and AI traffic will continue to grow at a rate far outpacing conventional traffic. In 2031, AI traffic will overtake conventional traffic, Washburn predicts.
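Those figures make the overtake prediction easy to sanity-check. A back-of-the-envelope sketch in Python, where the 5% annual growth assumed for conventional traffic is a placeholder rather than an Omdia figure, shows the sustained growth rate AI traffic would need to catch conventional traffic by 2031:

```python
# Illustrative check of the 2031 overtake prediction.
# The 5% conventional-traffic growth rate is an assumption, not Omdia data.
ai_2024 = 39.0             # exabytes of AI traffic in 2024 (Omdia)
conventional_2024 = 308.0  # exabytes of conventional traffic in 2024 (Omdia)
conv_growth = 0.05         # assumed annual growth of conventional traffic
years = 7                  # 2024 -> 2031

conventional_2031 = conventional_2024 * (1 + conv_growth) ** years

# Sustained AI-traffic CAGR needed for AI to match conventional traffic by 2031
required_cagr = (conventional_2031 / ai_2024) ** (1 / years) - 1

print(f"Conventional traffic in 2031: {conventional_2031:.0f} EB")
print(f"AI traffic CAGR needed to overtake: {required_cagr:.0%}")
```

Under these assumptions the required rate works out to roughly 41% a year; the reported 2024-to-2025 doubling is about 100% growth, so AI traffic would clear that bar even if its growth decelerates sharply.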
Net new AI traffic includes use cases such as apps driven by visual processing, surveillance, new games and media, and AI content generation. AI-enhanced traffic includes smart transcription services and content summaries, code assistance and review, intelligent analytics, natural language queries, and content filters. And those figures don't include fully private networks such as hyperscaler operations, on-prem traffic, and campuses.
For all enterprise traffic, a recent source of data is Zscaler, which released a report in March that showed an unprecedented 3,464% increase in enterprise AI activity over the course of one year. According to the report, in the last 11 months of 2024, 3,624 terabytes of data was transferred to and from over 800 AI applications such as ChatGPT.
One company that's already seeing an impact is Salesforce, which has added both generative AI and agentic AI capabilities to its cloud-based CRM platform.
"We're seeing a significant increase in data processing and transfer, particularly as we handle larger datasets for model training and real-time inference," says Paul Constantinides, the company's executive vice president of engineering. This translates to a demand for higher bandwidth, lower latency, and more robust network infrastructure.
All this AI activity will force enterprises to rethink their data center networking, cloud networking, edge networking, and network security.
AI networking in the data center
In data centers, AI poses two different types of networking challenges: one for training models, another for serving them. For model training, there's a huge volume of traffic between individual GPUs and servers.
"The demand for massive amounts of resources, particularly CPU and GPU, is driving a new zone within the enterprise data center dedicated to AI," says Lori MacVittie, distinguished engineer and chief evangelist in the Office of the CTO at Seattle-based F5 Networks. "These AI factories have special networking needs related to traffic steering that require smarter networking, new security capabilities, and the ability to handle higher volumes of data."
That's going to translate to a lot of new spending.
"AI, and particularly generative AI, is a major growth driver in data center Ethernet switching," says Brandon Butler, senior research manager for enterprise networks at IDC, in a March report. That's causing a renaissance in the data center portion of the Ethernet switch market, he says. The research firm is forecasting growth in the generative AI data center Ethernet switch market from $640 million in 2023 to more than $9 billion in 2028.
In addition, enterprises are beginning to experiment with agentic AI, in which individual AI-powered agents collaborate to carry out complex tasks, create code, or execute entire business workflows. It is often run on-prem or in private clouds in order to reduce costs and latency, and to keep all corporate data safe and secure.
Agentic AI traffic flows are expected to be dramatically different from the predictable, deterministic traffic created by traditional applications, though it's not clear yet what exactly those differences will be.
"How all these connections are going to flow across the network is unknown, and almost can't be known if the agentic AI can orchestrate based on what needs to be done," says F5's MacVittie. "You can't predict it."
AI in the cloud
Once models are put into production, however, the traffic will flow outside the data center, between the models and the end users.
"Inference requires strong wide area and multi-site connectivity, which is different than training's network topology, which requires dense local networks," says Jason Carolan, chief innovation officer at Charlotte, NC-based Flexential.
And flexibility is key, he adds. "Since many AI workloads are in proof-of-concept or experimentation mode for some time, network connections, topology, and capacity needs may change based on new models, new data, or new end-points," Carolan says.
Some enterprises are already prepared to handle AI traffic, says Derek Ashmore, application transformation principal at Asperitas Consulting. That's because they've already begun to move away from inflexible, hard-to-maintain legacy networks, he says. The move to modern cloud networking has been going on for a while, he adds, and was kicked into overdrive during the COVID pandemic. "Even without COVID, that move would have happened; it just would have happened at a slower rate," Ashmore says.
That's a good thing, since it sets up enterprises for the challenges coming with generative AI.
For example, multi-modal AI applications process text, images, audio, and video, so queries and responses can be very large. Google's latest Gemini 2.5 model has a context window size of a million tokens, with two million coming soon.
Two million tokens is around 1.5 million words. For reference, all the Harry Potter books put together contain around one million words. Big context windows allow for longer, more complicated conversations, or for AI coding assistants to examine larger portions of a code base. Plus, the AI's answers are dynamically generated, so there's no way to cache requests in most instances.
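To put those context windows in networking terms, here is a rough sizing sketch; the 0.75 words-per-token and 4 characters-per-token figures are common rules of thumb, not exact values for any particular model:

```python
# Rough size of a maxed-out two-million-token context window.
# 0.75 words/token and 4 chars/token are heuristics (assumptions, not specs).
tokens = 2_000_000
words = tokens * 0.75           # roughly the 1.5 million words cited above
chars = tokens * 4              # about 8 million characters of raw text
payload_mb = chars / 1_000_000  # about 8 MB per maximal request, before overhead

print(f"~{words / 1e6:.1f}M words, ~{payload_mb:.0f} MB of text per full-context request")
```

Because responses are generated dynamically and largely uncacheable, each such request has to traverse the network in full, which is one reason bandwidth demand scales with context size.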
As AI companies leapfrog each other in terms of capabilities, they will be able to handle even larger conversations, and agentic AI may increase the bandwidth requirements exponentially and in unpredictable ways.
Any website or app could become an AI app, simply by adding an AI-powered chatbot to it, says F5's MacVittie. When that happens, a well-defined, structured traffic pattern will suddenly start looking very different. "When you put the conversational interfaces in front, that changes how that flow actually happens," she says.
Another AI-related challenge that networking managers will need to address is multi-cloud complexity.
"We have a dispersion in terms of different hyperscale clouds, private clouds, and specialty clouds that just do special things," says Zac Smith, former Equinix executive and community member at New York-based Sustainable & Scalable Infrastructure Association.
For example, there are companies like CoreWeave that offer cloud-based GPUs. There are database companies and data lakes. There are AI platforms provided by hyperscalers, and there's AI running on-prem, in colos, and in private clouds.
"These are all new environments, and people now have to solve connectivity issues among very different types of clouds," says Smith.
Smith recently researched the different networking paradigms of Amazon and Google. "There are a lot of similarities," he says. "You can connect to other third parties in the same regions, peer, do fabric, but they all have different ways of doing it and it's not all normalized."
AI at the edge
Finally, there's edge AI, which poses its own set of networking challenges. Latency becomes critical, especially for mission-critical applications like self-driving cars, factory robots, and medical devices.
Other enterprise use cases for AI workloads include AI-powered security controls for video surveillance cameras and quality control in manufacturing environments, says Flexential's Carolan. Or a retail beauty store might have a platform to allow customers to virtually try products, he says.
Edge AI requires processing capabilities closer to data sources to reduce latency and bandwidth usage, says Salesforce's Constantinides. Low-latency edge networks, like CDNs, can help, he adds.
AI and network security
AI brings a whole host of potential security problems for enterprises. The technology is new and unproven, and attackers are rapidly developing techniques for attacking AI systems and their components.
That's on top of all the traditional attack vectors, says Rich Campagna, senior vice president of product management at Palo Alto Networks. "At the edge, devices and networks are often distributed, which leads to visibility blind spots," he adds. That makes it harder to fix problems if something goes wrong.
Palo Alto is developing its own AI applications, Campagna says, and has been for years. And so are its customers. "For example, I recently met with a retailer who is rearchitecting its store networks to support AI-powered inventory management at the edge," he says.
Networks need to adapt, he says. "Ensure that, regardless of where the asset is deployed, there are protection mechanisms in place, as close as possible to that asset."
And all the security challenges are magnified with agentic AI.
It's a problem that F5's MacVittie is already seeing. For example, when a company operates on zero trust and least privilege, how does it handle agent identities, credentials, and privileges? "All the traditional tools we use to enforce roles and credentials suddenly don't work, because they don't have roles or credentials," MacVittie says. "Or we give them root access, and that makes security folks twitch."
As AI proliferates across internal networks, the need for fine-grained security becomes critical, says Sanjay Kalra, product leader at Zscaler.
But there's another network security aspect to AI: the possibility that employees might upload sensitive data to public AI platforms or apps.
According to Kalra, Zscaler's enterprise customers blocked 60% of all AI transactions. Some companies turn off all access to public AI apps, while others look for indications that an employee is sharing financial data, personally identifiable information, medical data, or source code. Zscaler blocked 2.9 million attempts to upload this kind of data to ChatGPT alone. The most common data loss prevention (DLP) violation? Social security numbers.
Finally, there's one more type of unwanted AI traffic haunting enterprise networks: hackers. According to Bugcrowd's annual hacker survey, released in October, 86% of hackers say that AI has fundamentally changed their approach to hacking.
Now, this was a survey of "white hat" hackers, the good guys. The bad guy hackers don't take surveys. But they do also use AI. According to an October report by Keeper Security, 51% of IT and security leaders say that AI-powered attacks are the most serious threat facing their organizations.
Attackers are using AI to create better spam and more of it, to guess passwords, for reconnaissance, and more. Luckily, network security managers are getting their own AI, with all the top security vendors investing heavily in this area.