AI IPU Cloud Infrastructure

Exclusive AI cloud infrastructure to accelerate machine learning. Proudly made in Europe.

Receive a 75% discount on your first month.

Valid for any configuration in any location!

AI Infrastructure as a service

We bring together Graphcore IPUs and Gcore Cloud services to deliver AI IPU infrastructure under a unified UI and API for ML acceleration.

Get started quickly, save on computing costs, and seamlessly scale to massive IPU compute on demand.

Graphcore IPU cloud services are now available, with free trials and a range of pricing options enabling innovators everywhere to make new breakthroughs in machine intelligence.

Why have we chosen Graphcore IPUs?

Massive Performance Leap

  • World-leading performance for natural language processing, computer vision, and graph networks
  • Unique architecture for differentiated results
  • Low-latency inference

Much More Flexible

  • Designed for training and inference
  • Support for a wide range of ML models
  • Make new breakthroughs for competitive advantage

Easy to Use

  • Support from AI experts
  • Extensive documentation, tutorials, and pre-canned models
  • Popular ML framework support

Exclusive solution pack

Gcore IPU-based AI cloud is a Graphcore Bow IPU-POD scale-out cluster, offering an effortless way to add state-of-the-art machine intelligence compute on demand, without on-premises hardware deployment or building an AI infrastructure from scratch.

The IPU is an entirely new kind of massively parallel processor, co-designed from the ground up with the Poplar® SDK, to accelerate machine intelligence. Cloud IPU’s robust performance and low cost make it ideal for machine learning teams looking to iterate quickly and frequently on their solutions.
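
As a minimal illustration of how the Poplar SDK plugs into a familiar framework, the sketch below wraps a standard PyTorch model with PopTorch (the Poplar SDK's PyTorch interface) so the training step compiles for and runs on IPUs. The model architecture, dimensions, and optimizer settings are placeholders, not a recommended configuration.

```python
import torch
import poptorch

# Standard PyTorch model; the loss is returned from forward() so that
# PopTorch can compile the full training step for the IPU.
class ClassifierWithLoss(torch.nn.Module):
    def __init__(self, in_features=128, num_classes=10):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(in_features, 64),
            torch.nn.ReLU(),
            torch.nn.Linear(64, num_classes),
        )
        self.loss_fn = torch.nn.CrossEntropyLoss()

    def forward(self, x, labels=None):
        logits = self.net(x)
        if labels is None:
            return logits
        return logits, self.loss_fn(logits, labels)

model = ClassifierWithLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# Default execution options; batching and replication can be tuned here.
opts = poptorch.Options()

# Wrapping is the only IPU-specific step; the call looks like plain PyTorch.
training_model = poptorch.trainingModel(model, options=opts, optimizer=optimizer)

# One training step on dummy data (replace with a real DataLoader).
x = torch.randn(32, 128)
labels = torch.randint(0, 10, (32,))
logits, loss = training_model(x, labels)
print(float(loss))
```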

Dedicated vPOD

Leading the way in AI inference

The Graphcore Bow IPU-POD outperforms the popular NVIDIA DGX A100 when tested on three industry-standard AI models in the cloud, delivering higher throughput and a better price-performance ratio.

Suspension mode for Cloud virtual vPODs

Suspension mode provides a cost- and resource-efficient solution for temporarily pausing a virtual private cloud environment when it is not in use. By utilizing this feature, you can reduce expenses while preserving the integrity of your data and configurations.

  • Only storage and Floating IP (if active) are charged when a cluster is suspended
  • Cluster can be easily reactivated with the same configuration
  • The network configuration and cluster data are stored on external block storage (ephemeral storage is not preserved), so you can modify the configuration and expand the cluster as needed for greater flexibility
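
The suspend/resume workflow can also be scripted. The sketch below is purely illustrative: the base URL, endpoint paths, and cluster identifier are hypothetical placeholders rather than documented Gcore API routes; refer to the product documentation for the actual calls.

```python
import os
import requests

# Hypothetical base URL and routes -- placeholders only, NOT the documented
# Gcore API; they merely illustrate the suspend/resume workflow.
API_BASE = "https://api.example-cloud.invalid/v1/ai/clusters"
HEADERS = {"Authorization": f"Bearer {os.environ['API_TOKEN']}"}

def suspend(cluster_id: str) -> None:
    # While suspended, only block storage and any active floating IP are billed;
    # network configuration and data on block storage are preserved.
    requests.post(f"{API_BASE}/{cluster_id}/suspend",
                  headers=HEADERS, timeout=30).raise_for_status()

def resume(cluster_id: str) -> None:
    # Reactivates the cluster with the same configuration.
    requests.post(f"{API_BASE}/{cluster_id}/resume",
                  headers=HEADERS, timeout=30).raise_for_status()

if __name__ == "__main__":
    suspend("my-vpod-cluster")   # e.g. at the end of the working day
    resume("my-vpod-cluster")    # when the team is ready to train again
```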

Want to try other AI accelerators?

View our AI GPU Cloud offerings!

Try Gcore bare metal servers and virtual machines powered by NVIDIA A100 and H100 GPUs.

Both are powerful and versatile accelerators ideal for AI and high-performance computing workloads.

Features and advantages

  • World-class performance for natural language processing
  • Build, train, and deploy ready-to-use ML models via dashboard, API, or Terraform
  • Dataset management and integration with S3/NFS storage
  • Version control for hardware, code, and datasets
  • Secure, trusted cloud platform
  • Free egress traffic for Virtual vPODs
  • 99.9% uptime SLA
  • Highly skilled 24/7 technical support
  • Made in the EU

AI full lifecycle tools and integrations

ML and AI solutions:

    TensorFlow
    Keras
    PyTorch
    PaddlePaddle
    ONNX
    Hugging Face
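
As one example of framework interoperability, a model trained in PyTorch can be exported to ONNX for framework-independent deployment; a minimal sketch (the model and shapes are placeholders):

```python
import torch

# Placeholder model; substitute your trained network.
model = torch.nn.Sequential(
    torch.nn.Linear(128, 64),
    torch.nn.ReLU(),
    torch.nn.Linear(64, 10),
)
model.eval()

# The example input defines the exported graph's input shape.
dummy_input = torch.randn(1, 128)

torch.onnx.export(
    model,
    dummy_input,
    "classifier.onnx",
    input_names=["features"],
    output_names=["logits"],
    dynamic_axes={"features": {0: "batch"}, "logits": {0: "batch"}},
)
```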

Receiving and processing data:

    Storm
    Spark
    Kafka
    PySpark
    MS SQL
    Oracle
    MongoDB
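
On the data-ingestion side, a typical pattern is consuming a Kafka topic with PySpark Structured Streaming and landing the records as Parquet for later training. A minimal sketch, assuming a reachable broker and writable storage paths (all names are placeholders):

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

# Requires the spark-sql-kafka connector package on the Spark classpath.
spark = SparkSession.builder.appName("ingest-training-data").getOrCreate()

raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
    .option("subscribe", "training-events")             # placeholder topic
    .load()
)

# Kafka delivers key/value as binary; cast to strings before storing.
events = raw.select(
    col("key").cast("string"),
    col("value").cast("string"),
    col("timestamp"),
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "/data/training-events")                   # placeholder path
    .option("checkpointLocation", "/data/checkpoints/events")  # placeholder path
    .start()
)
query.awaitTermination()
```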

Development tools:

    Visual Studio Code
    PyCharm
    Jupyter
    GitLab
    GitHub
    RStudio
    Xcode
    Airflow
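
For example, a train-then-deploy pipeline can be orchestrated with Airflow; a minimal sketch with placeholder task bodies:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def train_model():
    # Placeholder: launch the training job (e.g. a PopTorch script) here.
    print("training...")

def deploy_model():
    # Placeholder: push the trained model to its serving endpoint here.
    print("deploying...")

# Airflow 2.4+; on older versions use schedule_interval instead of schedule.
with DAG(
    dag_id="ipu_training_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule=None,      # trigger manually, or set a cron expression
    catchup=False,
) as dag:
    train = PythonOperator(task_id="train", python_callable=train_model)
    deploy = PythonOperator(task_id="deploy", python_callable=deploy_model)
    train >> deploy
```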

Exploration and visualization tools:

    Seaborn
    Matplotlib
    TensorBoard
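
A common pattern is logging training metrics to TensorBoard and inspecting them in the dashboard; a minimal sketch with a dummy metric:

```python
from torch.utils.tensorboard import SummaryWriter

writer = SummaryWriter(log_dir="runs/demo")   # placeholder log directory

for step in range(100):
    fake_loss = 1.0 / (step + 1)              # dummy metric for illustration
    writer.add_scalar("train/loss", fake_loss, step)

writer.close()
# Inspect with: tensorboard --logdir runs
```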

Programming languages:

    JavaScript
    R
    Swift
    Python

Data platforms:

    PostgreSQL
    Hadoop
    Spark
    Vertica

Other supported tools and platforms:

    PyTorch Lightning
    Slurm
    Kubernetes
    Prometheus
    Grafana
    OpenBMC
    Redfish
    OpenStack
    VMware
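
As an example of pulling training data from one of the data platforms listed above, the sketch below reads a table from PostgreSQL into a pandas DataFrame via SQLAlchemy; the connection string, table, and columns are placeholders:

```python
import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection string, table, and columns -- adjust to your database.
engine = create_engine("postgresql+psycopg2://user:password@db-host:5432/mlops")

df = pd.read_sql(
    "SELECT features, label FROM training_samples WHERE split = 'train'",
    con=engine,
)
print(df.shape)
```
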
Accelerate ML with ready-made AI Infrastructure

With the AI Infrastructure, customers can easily train and compare models or run custom training code, with all models stored in one central model repository. These models can then be deployed to endpoints on the Gcore AI Infrastructure.

Gcore's IPU-based AI cloud is designed to help businesses across various fields, including finance, healthcare, manufacturing, and scientific research. It is built to support every stage of the AI adoption journey, from building a proof of concept to training and deployment.

  • AI model development
  • ML models: Face recognition, Object detection
  • AI training and hyperparameter tuning
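
Hyperparameter tuning can be as simple as a grid sweep over candidate settings; a framework-agnostic minimal sketch with a placeholder training function:

```python
from itertools import product

def train_and_evaluate(learning_rate: float, batch_size: int) -> float:
    # Placeholder: run a real training job and return a validation metric.
    return 1.0 / (learning_rate * batch_size)   # dummy score, lower is better

grid = {
    "learning_rate": [1e-4, 1e-3, 1e-2],
    "batch_size": [16, 32, 64],
}

best = None
for lr, bs in product(grid["learning_rate"], grid["batch_size"]):
    score = train_and_evaluate(lr, bs)
    if best is None or score < best[0]:
        best = (score, {"learning_rate": lr, "batch_size": bs})

print("best:", best)
```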

ML Model delivery and deployment pipelines

Locations

IPU-Pod256 is available in Amsterdam. It allows customers to explore AI compute at a supercomputing scale. Designed to accelerate large and demanding machine learning models, IPU-Pod256 gives you the AI resources of a tech giant.

Amsterdam
Dedicated Bow Pods:

Product | Server Config | IPUs | Price
Bow Pod4 | 2x7763 / 512GB RAM / 2x450 SATA + 7x1.8TB NVMe / 2x100G | 4 | Order
Bow Pod16 | 2x7763 / 512GB RAM / 2x450 SATA + 7x1.8TB NVMe / 2x100G | 16 | Order
Bow Pod64 | 2x7763 / 512GB RAM / 2x450 SATA + 7x1.8TB NVMe / 2x100G | 64 | Order
Bow Pod128 | 2x7763 / 512GB RAM / 2x450 SATA + 7x1.8TB NVMe / 2x100G | 128 | Order
Bow Pod256 | 2x7763 / 512GB RAM / 2x450 SATA + 7x1.8TB NVMe / 2x100G | 256 | Order
Bow Pod1024 | 2x7763 / 512GB RAM / 2x450 SATA + 7x1.8TB NVMe / 2x100G | 1024 | Order

Virtual vPODs:

Product | Server Config | IPUs | Price
BOW-vPOD4 | 60 vCPU / 116GB RAM / 1100GB NVMe (ephemeral) / 100Gbit/s interconnect | 4 | Order
BOW-vPOD16 | 120 vCPU / 232GB RAM / 2200GB NVMe (ephemeral) / 100Gbit/s interconnect | 16 | Order
BOW-vPOD16 | 240 vCPU / 464GB RAM / 4400GB NVMe (ephemeral) / 100Gbit/s interconnect | 16 | Order
BOW-vPOD64 | 240 vCPU / 464GB RAM / 4400GB NVMe (ephemeral) / 100Gbit/s interconnect | 64 | Order

Prices do not include VAT.

Try out vPOD4 for free for 24 hours! Contact our sales team to get the offer!

We stand for the digital sovereignty of the European Union

With the help of IPU-based AI infrastructure solutions, we are realizing Luxembourg's HPC ambitions, turning it into the heart of Europe's AI hub. Thanks to Graphcore hardware and the Gcore edge cloud, the new AI infrastructure can be used fully as a service.

Christophe Brighi, Head of Economic and Commercial Affairs
“This partnership between Luxembourg-based cloud and edge solutions provider Gcore and the UK IPU producer Graphcore illustrates not only the vast opportunities that arise for trade and cooperation between the two countries, but it also confirms Luxembourg’s position as a leading data economy in the EU.”
Nigel Toon, Co-founder and CEO of Graphcore
“Graphcore and Gcore solution is perfect for AI. It will make the power and flexibility of the IPU available to anyone who wants to accelerate their current workloads or to explore the use of next generation ML models.”
Andre Reitenbach, CEO of Gcore
“Gcore is the first European provider to partner with Graphcore to bring innovations to a rapidly changing cloud market. To meet their changing AI needs, users are looking for trusted technologies that are highly efficient, easily accessible, and highly flexible.”

Contact us to get a personalized offer