The AI Stack

Graphcore

Graphcore develops Intelligence Processing Units (IPUs), processors purpose-built for machine learning workloads.

Updated April 2026

Overview

Founded
2016
Headquarters
Bristol, UK
Segment
GPU & AI Accelerators

Product overview

Graphcore's flagship IPU, the Colossus MK2 GC200, packs 1,472 independent processor cores and 900 MB of In-Processor-Memory, delivering up to 250 TFLOPS of AI compute. Four GC200s make up the IPU-M2000 system (1 petaFLOP), which scales to IPU-POD clusters of up to 64,000 IPUs over Graphcore's IPU-Fabric interconnect. Customers span AI research labs, technology companies such as Microsoft, and sectors including finance (Citadel), cloud, robotics, and healthcare. Unlike GPU designs, the IPU combines a massively parallel architecture with memory held on-chip next to the cores, enabling fine-grained parallelism; the accompanying Poplar SDK handles model deployment, targeting higher efficiency in both training and inference.

Revenue model

Graphcore sells IPUs, IPU-POD systems, and software to enterprises through direct and OEM channels, supplemented by support services. SoftBank acquired the company in 2024 for roughly $600M; it now operates as a subsidiary developing next-generation AI compute.

Moat

  • Proprietary Technology
  • Patents/IP
  • First Mover

Graphcore's key competitive moat is its proprietary Intelligence Processing Unit (IPU) architecture, a chip design built from the ground up for machine learning workloads. It is backed by an extensive patent portfolio and strong R&D capabilities that differentiate it from GPU-based competitors like NVIDIA.