
Gcore enhances Everywhere Inference with flexible deployment options, including cloud, on-premise, and hybrid

  • January 16, 2025
  • 2 min read

Luxembourg, January 16, 2025 – Gcore, the global edge AI, cloud, network, and security solutions provider, today announced a major update to Everywhere Inference, formerly known as Inference at the Edge. This update offers greater flexibility in AI inference deployments, delivering ultra-low latency experiences for AI applications. Everywhere Inference now supports multiple deployment options, including on-premises infrastructure, Gcore’s cloud, public clouds, and hybrid combinations of these environments.

Gcore developed this update to its inference solution to address changing customer needs. With AI inference workloads growing rapidly, Gcore aims to empower businesses with flexible deployment options tailored to their individual requirements. Everywhere Inference leverages Gcore’s extensive global network of over 180 points of presence, enabling real-time processing, instant deployment, and seamless performance across the globe. Businesses can now deploy AI inference workloads across diverse environments while ensuring ultra-low latency by processing workloads closer to end users. The solution also enhances cost management and simplifies regulatory compliance across regions, offering a comprehensive and adaptable approach to modern AI challenges.

Seva Vayner, Product Director of Edge Cloud and Edge AI at Gcore, commented: “The update to Everywhere Inference marks a significant milestone in our commitment to enhancing the AI inference experience and addressing evolving customer needs. The flexibility and scalability of Everywhere Inference make it an ideal solution for businesses of all sizes, from startups to large enterprises.”

The new update enhances deployment flexibility by introducing smart routing, which automatically directs workloads to the nearest available compute resource. Additionally, Everywhere Inference now offers multi-tenancy for AI workloads, leveraging Gcore’s unique multi-tenancy capabilities to run multiple inference tasks simultaneously on existing infrastructure. This approach optimizes resource utilization for greater efficiency.

These new features address common challenges faced by businesses deploying AI inference. Balancing multiple cloud providers and on-premises systems for operations and compliance can be complex. Smart routing enables users to direct workloads to their preferred region, helping them stay compliant with local data regulations and industry standards. Data security is another key concern; with Gcore’s new flexible deployment options, businesses can securely isolate sensitive information on-premises, enhancing data protection.
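As an illustration only (Gcore has not published the routing internals), region-constrained nearest-resource selection of the kind described above could be sketched as follows. All names, fields, and latency figures here are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class ComputeResource:
    name: str
    region: str
    latency_ms: float  # measured latency from the requesting client
    available: bool

def route_workload(resources, preferred_region=None):
    """Pick the lowest-latency available resource.

    If preferred_region is set (e.g. for data-residency compliance),
    only resources in that region are considered.
    """
    candidates = [
        r for r in resources
        if r.available and (preferred_region is None or r.region == preferred_region)
    ]
    if not candidates:
        raise RuntimeError("no available compute resource matches the constraints")
    return min(candidates, key=lambda r: r.latency_ms)

resources = [
    ComputeResource("lux-1", "eu", 12.0, True),
    ComputeResource("ams-1", "eu", 9.0, True),
    ComputeResource("sin-1", "apac", 80.0, True),
]

print(route_workload(resources).name)                           # nearest overall
print(route_workload(resources, preferred_region="apac").name)  # region-pinned
```

Without a region constraint, the router simply picks the lowest-latency available resource; pinning a region trades latency for compliance by filtering candidates first.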

Learn more at https://gcore.com/everywhere-inference.

About Gcore

Gcore is a global edge AI, cloud, network, and security solutions provider. Headquartered in Luxembourg, with a team of 600 operating from ten offices worldwide, Gcore provides solutions to global leaders in numerous industries. Gcore manages its global IT infrastructure across six continents, delivering some of the best network performance in Europe, Africa, and LATAM, with an average response time of 30 ms worldwide. Gcore’s network consists of 180 points of presence worldwide in reliable Tier IV and Tier III data centers, with a total network capacity exceeding 200 Tbps. Learn more at gcore.com and follow Gcore on LinkedIn, Twitter, and Facebook.

Gcore press contact

pr@gcore.com

PR agency contact

gcore@aspectusgroup.com
