Supercharging ML and AI Workloads on Graphcore IPUs with Gcore Cloud and UbiOps

  • June 29, 2023
  • 2 min read

As AI and machine learning advance rapidly, the demand for high-performance computing hardware grows. We are excited to announce a partnership between Gcore, Graphcore, and UbiOps that offers a new, powerful solution in this sphere. Graphcore's Intelligence Processing Units (IPUs), UbiOps' powerful MLOps platform, and Gcore Cloud together offer unmatched efficiency for AI and ML workloads. Let's take a closer look.

Unique Service Offering for AI Teams: On-Demand IPUs in the Cloud

By partnering with Graphcore and UbiOps, Gcore Cloud is taking a significant step forward in empowering AI teams. Our unique service offering combines best-in-class IPU hardware, an MLOps platform, and cloud infrastructure.

Graphcore is a leader in developing IPU hardware designed to meet the demanding requirements of modern AI workloads. IPUs primarily leverage model parallelism to speed up computation, in contrast to the data parallelism typically used on GPUs.

UbiOps is a powerful machine learning operations (MLOps) platform that simplifies AI model deployment, orchestration, and management. It helps businesses run AI models and workflows efficiently across cloud environments, accelerating time to market with AI solutions, saving on DevOps and cloud engineering costs, and making efficient use of compute resources through on-demand hardware scaling.

Together, Gcore Cloud, Graphcore, and UbiOps are creating a seamless experience for AI teams by making IPUs available directly in the UbiOps platform. Using UbiOps' multi-cloud technology, the UbiOps orchestration layer was connected to Gcore Cloud infrastructure equipped with Graphcore IPUs, so that IPUs are readily available on demand for UbiOps users to run AI models and training jobs.

This integration allows users to leverage the computational power of IPUs for their specific job requirements, enabling IPU-powered scalable model inference APIs and faster model training jobs in the UbiOps platform. Users can also take advantage of the out-of-the-box MLOps features that UbiOps offers, such as model versioning, governance, and monitoring.
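
To give an idea of what this looks like in practice, here is a rough sketch of creating a deployment version on an IPU-backed instance type through the UbiOps Python client. The project name, instance type identifier, environment name, and field definitions are placeholders rather than the exact values exposed by the integration, and the deployment code package upload is omitted for brevity.

```python
# Rough sketch (not official sample code): create a UbiOps deployment and a
# version that runs on an IPU-backed instance type. Names marked as placeholders
# are assumptions and will differ in a real project.
import ubiops

API_TOKEN = "Token <YOUR_UBIOPS_API_TOKEN>"
PROJECT_NAME = "ipu-demo"  # placeholder project name

configuration = ubiops.Configuration()
configuration.api_key["Authorization"] = API_TOKEN
api = ubiops.CoreApi(ubiops.ApiClient(configuration))

# Create a deployment that accepts an image and returns class probabilities.
api.deployments_create(
    project_name=PROJECT_NAME,
    data=ubiops.DeploymentCreate(
        name="cifar10-cnn",
        input_type="structured",
        output_type="structured",
        input_fields=[{"name": "image", "data_type": "file"}],
        output_fields=[{"name": "probabilities", "data_type": "double_array"}],
    ),
)

# Create a version of the deployment pinned to an IPU instance type.
version = api.deployment_versions_create(
    project_name=PROJECT_NAME,
    deployment_name="cifar10-cnn",
    data=ubiops.DeploymentVersionCreate(
        version="v1",
        environment="python3-10",   # base environment; adjust to what is available
        instance_type="ipu-pod4",   # placeholder name for the IPU instance type
        minimum_instances=0,        # scale to zero when idle
        maximum_instances=1,
        maximum_idle_time=300,
    ),
)
print(version)

# The deployment package (model code) is then uploaded, after which requests can
# be sent to the resulting inference API like any other UbiOps deployment.
```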

Benchmarking the Benefits of IPUs

To demonstrate the benefits of using IPUs compared to other devices, we benchmarked workloads on three different compute resources: CPU, GPU, and IPU.

| Device type | Device       | CPU RAM | vCPU |
|-------------|--------------|---------|------|
| CPU         | CPU          | 70 GB   | 10   |
| GPU         | A100 (40 GB) | 70 GB   | 10   |
| IPU         | POD-4        | 70 GB   | 10   |

A convolutional neural network (CNN) was trained on the CIFAR-10 dataset on each of these three devices, and training speeds were compared for different effective batch sizes, where the effective batch size is the product of the data batch size and the number of gradient accumulation steps; the sketch after the table shows how this is configured on an IPU.

| Type     | Effective batch size* | Graph compilation (s) | Training duration (s) | Time per epoch (s) | Unit cost (€/h) |
|----------|-----------------------|-----------------------|-----------------------|--------------------|-----------------|
| IPU-POD4 | 50                    | ~180                  | 472                   | 8.1                | From €2.5       |
| IPU-POD4 | 8                     | ~180                  | 1420                  | 26.0               | From €2.5       |
| GPU      | 50                    | 0                     | 443                   | 8.6                | From €4         |
| GPU      | 8                     | 0                     | 2616                  | 51.7               | From €4         |
| CPU      | 50                    | 0                     | ~5 hours              | 330                | From €1.3       |
| CPU      | 4                     | 0                     | 10+ hours             | 10+ minutes        | From €1.3       |
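
For context, the sketch below shows roughly how a training run like this is set up on an IPU with Graphcore's PopTorch library, and how the effective batch size arises from the micro-batch size and gradient accumulation. The network architecture and hyperparameters are illustrative, not the exact benchmark code.

```python
# Minimal PopTorch sketch: a small CNN on CIFAR-10 with an effective batch size
# of 50 = micro-batch 5 x gradient accumulation 10. Illustrative only.
import torch
import torch.nn as nn
import torchvision
import torchvision.transforms as T
import poptorch


class CNNWithLoss(nn.Module):
    """Wraps a small CNN so that forward() also returns the loss, as PopTorch expects."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(64 * 8 * 8, 10)
        self.loss = nn.CrossEntropyLoss()

    def forward(self, x, labels=None):
        out = self.classifier(self.features(x).flatten(1))
        if labels is not None:
            return out, self.loss(out, labels)
        return out


train_set = torchvision.datasets.CIFAR10(
    root="./data", train=True, download=True, transform=T.ToTensor()
)

# Gradient accumulation of 10 with a micro-batch of 5 gives an
# effective batch size of 50, as in the table above.
opts = poptorch.Options()
opts.Training.gradientAccumulation(10)
opts.deviceIterations(1)

model = CNNWithLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

# poptorch.DataLoader assembles batches that match the IPU options.
train_loader = poptorch.DataLoader(
    opts, train_set, batch_size=5, shuffle=True, drop_last=True
)

# trainingModel compiles the graph for the IPU on the first call
# (the "graph compilation" time reported in the benchmark).
training_model = poptorch.trainingModel(model, options=opts, optimizer=optimizer)

for epoch in range(5):
    for images, labels in train_loader:
        # Forward, backward, and optimizer step all run on the IPU.
        _, loss = training_model(images, labels)
    print(f"epoch {epoch}: last loss {loss.mean().item():.3f}")
```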

The results show that training times on a CPU were already quite lengthy, even for a relatively simple CNN and a small dataset. Specialized hardware brought a significant improvement in speed on both IPU and GPU, and with minimal optimization the IPU achieved an even shorter time per epoch than the GPU.

Although an IPU costs more per hour than a CPU, its efficiency more than justifies the difference. The time savings generated by an IPU can lead to faster results and innovation, contributing to a higher return on investment.
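
As a back-of-the-envelope illustration, taking the indicative "from" prices and the durations in the table above at face value, the cost per training run works out roughly as follows:

```python
# Back-of-the-envelope cost per training run, using the indicative "from" prices
# and the measured durations above. Real prices and runtimes will vary.
runs = {
    # device: (total seconds per run, hourly price in EUR)
    "IPU-POD4 (batch 50)": (180 + 472, 2.5),  # graph compilation + training
    "GPU (batch 50)":      (443, 4.0),
    "CPU (batch 50)":      (5 * 3600, 1.3),   # ~5 hours
}

for device, (seconds, price_per_hour) in runs.items():
    cost = seconds / 3600 * price_per_hour
    print(f"{device}: ~{seconds / 60:.0f} min, ~€{cost:.2f} per run")

# The CPU's lower hourly rate is outweighed by its much longer runtime:
# roughly €6.5 per run versus well under €1 on the accelerators.
```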

Accelerating AI Innovation

This collaboration between Gcore Cloud, Graphcore, and UbiOps unlocks the potential of IPUs for AI and ML workloads, providing AI teams with accessible, high-performance computing resources. We’re excited about the potential of this partnership to foster success and help more AI projects achieve their goals.

If you want to try out Graphcore IPUs on Gcore Cloud with UbiOps, contact sales@gcore.com.
