
Announcing new tools, apps, and regions for your real-world AI use cases

  • July 29, 2025
  • 2 min read
Three updates, one shared goal: helping builders move faster with AI. Our latest releases for Gcore Edge AI bring real-world AI deployments within reach, whether you’re a developer integrating genAI into a workflow, an MLOps team scaling inference workloads, or a business that simply needs access to performant GPUs in the UK.

MCP: make AI do more

Gcore’s MCP server implementation is now live on GitHub. The Model Context Protocol (MCP) is an open standard, originally developed by Anthropic, that turns AI models into agents that can carry out real-world tasks. It allows you to plug genAI models into everyday tools like Slack, email, Jira, and databases, so your genAI can read, write, and reason directly across systems. Think of it as a way to turn “give me a summary” into “send that summary to the right person and log the action.”

“AI needs to be useful, not just impressive. MCP is a critical step toward building AI systems that drive desirable business outcomes, like automating workflows, integrating with enterprise tools, and operating reliably at scale. At Gcore, we’re focused on delivering that kind of production-grade AI through developer-friendly services and top-of-the-range infrastructure that make real-world deployment fast and easy.” 
— Seva Vayner, Product Director of Edge Cloud and AI, Gcore

To get started, clone the repo, explore the toolsets, and test your own automations.
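Under the hood, MCP clients and servers exchange JSON-RPC 2.0 messages: the client lists the tools a server exposes, then invokes one by name. The sketch below shows the shape of those two messages; the tool name and arguments are hypothetical, and a real client would send these over stdio or HTTP to the server from the repo.

```python
import json

def mcp_request(method: str, params: dict, request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 message of the kind MCP clients send."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": method,
        "params": params,
    })

# Ask the server what tools it exposes, then call one.
list_msg = mcp_request("tools/list", {})
call_msg = mcp_request("tools/call", {
    "name": "send_summary",  # hypothetical tool name for illustration
    "arguments": {"channel": "#ops", "text": "Daily summary attached"},
}, request_id=2)
```

This is the "send that summary to the right person" step from above expressed as a protocol message: the model decides which tool to call, and the MCP client delivers the request.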

Gcore Application Catalog: inference without overhead

We’ve upgraded the Gcore Model Catalog into something even more powerful: an Application Catalog for AI inference. You can still deploy the latest open models with three clicks. But now, you can also tune, share, and scale them like real applications.

We’ve re-architected our inference solution so you can:

  • Run prefill and decode stages in parallel
  • Share KV cache across pods instead of tying it to individual GPUs (from August 2025)
  • Toggle the WebUI and secure API independently (from August 2025)

These changes cut down on GPU memory usage, make deployments more flexible, and reduce time to first token, especially at scale. And because everything is application-based, you’ll soon be able to optimize for specific business goals like cost, latency, or throughput.
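To see why shared KV cache matters, it helps to estimate how much memory the cache consumes per sequence. A minimal back-of-the-envelope sketch, assuming a Llama-3-8B-style model (32 layers, 8 KV heads, head dimension 128, fp16); substitute your own model's figures:

```python
def kv_cache_bytes(layers: int, kv_heads: int, head_dim: int,
                   seq_len: int, dtype_bytes: int = 2) -> int:
    """Bytes of KV cache one sequence needs: a key and a value vector
    per layer, per KV head, per token."""
    return 2 * layers * kv_heads * head_dim * dtype_bytes * seq_len

# One 8192-token conversation at fp16:
per_seq = kv_cache_bytes(layers=32, kv_heads=8, head_dim=128, seq_len=8192)
print(per_seq / 2**30, "GiB per concurrent sequence")
```

At roughly 1 GiB per 8K-token sequence, every concurrent session pinned to a single GPU eats into the memory available for model weights and batching, which is why pooling the cache across pods frees capacity at scale.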

Here’s who benefits:

  • ML engineers can deploy high-throughput workloads without worrying about memory overhead
  • Backend developers get a secure API, no infra setup needed
  • Product teams can launch demos instantly with the WebUI toggle
  • Innovation labs can move from prototype to production without reconfiguring
  • Platform engineers get centralized caching and predictable scaling

The new Application Catalog is available now through the Gcore Customer Portal.
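Once a model is deployed from the catalog, backend code talks to it over its secure API. As a sketch, assuming the deployment exposes an OpenAI-compatible chat endpoint (a common convention; confirm the exact URL, model name, and auth scheme in the Customer Portal, as all identifiers below are placeholders):

```python
import json

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-compatible chat completion payload.

    `model` is whatever name you gave the deployment in the
    Application Catalog (placeholder here); the endpoint URL and
    bearer token come from the portal.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_chat_request("my-catalog-model", "Summarize today's tickets")
body = json.dumps(payload)
# POST `body` to the deployment's chat completions endpoint with an
# Authorization: Bearer <token> header.
```

Keeping the payload builder separate from the HTTP call makes it easy to unit-test request construction without hitting the live endpoint.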

Chester data center: NVIDIA H200 capacity in the UK

Gcore’s newest AI cloud region is now live in Chester, UK. This marks our first UK location, launched in partnership with Northern Data. Chester offers 2,000 NVIDIA H200 GPUs with BlueField-3 DPUs for secure, high-throughput compute on Gcore GPU Cloud, serving your training and inference workloads. You can reserve your H200 GPUs immediately via the Gcore Customer Portal.

This launch solves a growing problem: UK-based companies building with AI often face regional capacity shortages, long wait times, or poor performance when routing inference to overseas data centers. Chester fixes that with immediate availability on performant GPUs.

Whether you’re training LLMs or deploying inference for UK and European users, Chester offers local capacity, low latency, and immediate availability.

Next steps

Deploy your AI workload in three clicks today!
