From CDN to AI powerhouse: Gcore at 12
- March 5, 2026

Two partnership announcements. One week. Twelve years in the making.
Last week, as Gcore turned 12, we launched a new feature in partnership with NVIDIA and announced that Microsoft selected Gcore to join its elite group of global CDN partners. Back-to-back milestones that might look like a lucky week. But to anyone who's watched Gcore grow, it's clear there's more to the story.
And the best way to tell that story isn't through a timeline or a list of product launches. It's through these two exciting developments, and how one company came to lead innovation in both CDN and AI—and more besides.
The network foundation
In 2013, Gcore was founded as a content delivery network provider serving the gaming community. CDN was an exciting space: moving bits reliably, at speed, at scale, to wherever in the world users happened to be, and delivering online experiences that hadn't been possible before.
Over the following years, Gcore built one of the most distributed, lowest-latency networks on the planet, now spanning 210+ points of presence across six continents, with an enviable edge footprint and integrated cloud, security, and network services. The network wasn't just fast. It was mature: tested for years under real workloads, by real customers, in demanding markets and industries where performance and reliability aren't optional.
That foundation would become the launchpad for everything that came next.
We didn't start building AI infrastructure from scratch. We started from something far more valuable: a production-grade global network with over a decade of operational expertise. You can't replicate our team's experience.
Andre Reitenbach, CEO, Gcore
The pivot that wasn't really a pivot
In 2023, Gcore made a move that surprised some observers: it began building AI cloud products. What was a CDN company doing in AI infrastructure?
The answer: exactly what it had always done. Moving compute closer to where it's needed, reliably, at scale.
Gcore's original AI public cloud wasn't built from a standing start; it was built on top of the same edge network as all its services. The same edge nodes that had been serving video and web content for years became the substrate for GPU compute and serverless AI inference workloads.
Where other cloud providers had to build AI infrastructure from scratch in a handful of regions, Gcore could deploy it across its entire edge footprint, making inference not just powerful, but fast and truly global.
This distinction now matters enormously as AI moves from the data center to the edge. Training a model in a centralized cloud is one thing. Running inference with single-digit millisecond latency for a user in Frankfurt, São Paulo, or Singapore is another. Gcore was already doing the latter for content before most AI infrastructure vendors had drawn their first architecture diagram.
Sovereignty, airgaps, and enterprise realities
As enterprise AI adoption accelerated, a new set of requirements emerged that the hyperscalers weren't well positioned to meet: data sovereignty, regulatory compliance, air-gapped deployments for sensitive industries, and private AI solutions that never touch a public cloud.
Gcore moved deliberately into this space. The CDN heritage helped here too. Operating in dozens of countries means operating under dozens of regulatory regimes. Gcore already had the experience in compliance, data residency, and mission-critical workloads that sovereign AI deployments require. That's what inspired AI Cloud Stack and Everywhere AI, the latest solutions in Gcore's product suite.
Today, Gcore offers a full spectrum:
- Sovereign AI configurations for governments and regulated industries
- Air-gapped solutions for defense and critical national infrastructure
- Private AI deployments for enterprises that need to keep data in-house
- Public AI cloud built on edge infrastructure
Few vendors can credibly offer all four. Gcore can, because the network has always been the product, and the network is everywhere.
Building with the best: Gcore x NVIDIA Dynamo
Our announcement with NVIDIA Dynamo underscores a different dimension of Gcore's maturity: its position in the AI software ecosystem. Gcore is no longer just infrastructure for AI; it's part of the stack that AI builders reach for.
Gcore's 12 years of developing global, sovereign infrastructure meant we already had the ideal network to bring Dynamo's full capabilities to developers everywhere: at the edge, at scale, and at the click of a button. Our recent AI software developments also allow us to provide Dynamo in fully air-gapped environments.
Seva Vayner, Product Director, Cloud and AI, Gcore
Gcore and NVIDIA together close the gap between cutting-edge AI research and production-grade deployment.
Maintaining reliability in a dynamic CDN landscape: Microsoft x Gcore CDN
Last week, we announced that Microsoft selected Gcore to join its elite group of global CDN partners. With operations spanning the globe and billions of users worldwide, Microsoft requires massive-scale infrastructure to reliably deliver updates, patches, and content to customers in every region.
The Gcore team was highly responsive throughout our integration process. They quickly adapted to our requirements while maintaining the standardized configuration approach we need across our provider ecosystem, which made the onboarding process smooth and efficient. Gcore has consistently delivered the global capacity and reliability we require.
Joey Etzler, Principal Technical Program Manager – Content Delivery Network, Microsoft
By leveraging an extensive global network with 210+ points of presence (PoPs) and continuously evolving its infrastructure, Gcore is well equipped to meet the demands of global technology companies, delivering the capacity, reliability, and global reach that enterprise CDN customers demand.
Twelve years in, building strong: Nokia x Gcore AI software
The Gcore x Nokia partnership announced earlier this year is a clear signal of where Gcore is headed. Nokia's enterprise and telco customers need AI that works in environments where public cloud isn't an option: industrial sites, government networks, edge deployments with strict data controls. Through the partnership, Gcore AI Cloud Stack reaches those customers directly, bringing sovereign and air-gapped AI to some of the most demanding production environments in the world. It's a market that barely existed three years ago. Gcore is positioned to lead it.
What drew us to Gcore was their deep expertise and engineering heritage. With twelve years of experience building edge infrastructure and AI software, they bring a strong innovation culture to the table. Together with Nokia's open, programmable networking and Gcore's AI Cloud Stack, we are helping customers quickly turn GPU clusters into scalable AI clouds while accelerating time to market and unlocking new revenue.
Mark Vanderhaegen, Head of Business Development, Data Center Networks for Nokia
The roadmap ahead includes deeper edge inference capabilities, expanded sovereign AI deployments across new geographies, and continued investment in the software stack that makes Gcore a winning choice for enterprises and builders alike.
Twelve years of network maturity is the backbone of our industry-leading innovation.
Happy birthday, Gcore. Here's to the next 12 years.