Question

M.2 MAX AI Inference Acceleration card

  • February 17, 2026
  • 8 replies
  • 89 views

Will this ever be made available? I am interested in the 16GB version.

8 replies

Spanner
Axelera Team
  • Axelera Team
  • February 17, 2026

It’s absolutely coming, yep! I don’t know a date, but it’s guaranteed to be with us 😀

What’s the project you’re planning with the Max?


  • Author
  • Cadet
  • February 17, 2026

Hi Spanner. Thanks for replying. Just playing around with larger video models, and I wanted to test the 16GB version so it can hold the VL model. It's been many months of seeing "coming soon" but still no sign of the product. Bummer.


Spanner
Axelera Team
  • Axelera Team
  • February 17, 2026

You’re right, yeah, and my apologies for any delays, for what it’s worth! I’m sure it’ll be worth the wait, and I’ll make sure to keep you guys up to speed the moment I hear anything. 👍


  • Author
  • Cadet
  • February 24, 2026

A quick question: how does the Metis stand up in comparison with, say, the Jetson Orin Nano 8GB or 16GB? Price-wise you're even a bit higher, and with the Jetson you get a complete package of compute and accelerator hardware. I'd appreciate a clarification in case I'm missing the point of the Metis.

On another note, if I were to slot a Metis into an Nvidia machine, do the TOPS add up in any way, or what exactly can I leverage with this setup?


Spanner
Axelera Team
  • Axelera Team
  • February 24, 2026

A quick question: how does the Metis stand up in comparison with, say, the Jetson Orin Nano 8GB or 16GB? Price-wise you're even a bit higher, and with the Jetson you get a complete package of compute and accelerator hardware. I'd appreciate a clarification in case I'm missing the point of the Metis.

On another note, if I were to slot a Metis into an Nvidia machine, do the TOPS add up in any way, or what exactly can I leverage with this setup?

Good questions, and they're worth unpacking separately.

Metis vs Jetson Orin Nano

The comparison is a bit apples to oranges. The Orin Nano is a complete system-on-module: CPU, GPU, memory, I/O, the whole onion. The Metis M.2 is a pure inference accelerator. It needs a host system, but it delivers significantly higher AI inference performance in that role. The Orin Nano 8GB delivers around 40 TOPS, the 16GB gets you 100 TOPS, from what I can see. A single Metis chip targets 214 TOPS, at about 15 TOPS/W energy efficiency.
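To put those quoted figures side by side (these are the peak numbers mentioned above, not benchmarks, and real-world throughput always depends on the model):

```python
# Peak figures as quoted in this thread; illustrative only, not measured results.
devices = {
    "Jetson Orin Nano 8GB": {"tops": 40},
    "Jetson 16GB": {"tops": 100},
    "Metis (single chip)": {"tops": 214, "tops_per_watt": 15},
}

# The quoted 15 TOPS/W efficiency implies a peak power envelope for Metis:
metis = devices["Metis (single chip)"]
implied_watts = metis["tops"] / metis["tops_per_watt"]  # 214 / 15
print(f"Implied Metis power at peak: {implied_watts:.1f} W")
```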

So you're not replacing a Jetson; you're adding purpose-built inference acceleration to a host machine you might already have, or to a host that better suits the project's needs (which could be really affordable, like an Orange Pi or Raspberry Pi). If you're running multiple camera streams or need to maximise throughput from an existing x86 or ARM host, the Metis is doing a different job to the Jetson.

Slotting Metis into a machine with an Nvidia GPU

The TOPS don't "add up" in a meaningful way, I don’t think. The Metis AIPU and an Nvidia GPU are completely separate architectures running separate workloads via their respective stacks (Voyager SDK vs CUDA). You can't combine them into a single unified pool of compute. Or at least, I can’t think of how you could. What you can do is split your pipeline, like offload inference-heavy tasks to Metis while the Nvidia GPU handles other workloads (rendering, pre/post-processing, anything CUDA-accelerated). That could be a pretty cool setup.
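The split-pipeline idea can be sketched as two stages handing frames across a queue. This is purely illustrative: `run_metis_inference` and `gpu_postprocess` are hypothetical stand-ins for the real Voyager SDK and CUDA calls, not actual APIs.

```python
# Sketch of a split pipeline: one stage for the inference accelerator,
# one for GPU-side work. The two stage functions below are placeholders.
import queue
import threading

def run_metis_inference(frame):
    # Stand-in for detection on the Metis AIPU (Voyager SDK in practice).
    return {"frame": frame, "detections": [f"obj_{frame}"]}

def gpu_postprocess(result):
    # Stand-in for CUDA-side work (rendering, tracking, etc.).
    return f"frame {result['frame']}: {len(result['detections'])} objects"

def pipeline(frames):
    handoff = queue.Queue(maxsize=8)  # bounded queue between the two stages
    outputs = []

    def inference_stage():
        for frame in frames:
            handoff.put(run_metis_inference(frame))
        handoff.put(None)  # sentinel: no more frames

    def postprocess_stage():
        while (result := handoff.get()) is not None:
            outputs.append(gpu_postprocess(result))

    producer = threading.Thread(target=inference_stage)
    consumer = threading.Thread(target=postprocess_stage)
    producer.start(); consumer.start()
    producer.join(); consumer.join()
    return outputs

print(pipeline(range(3)))
```

The point of the bounded queue is that neither device sits idle waiting for the other: the accelerator keeps producing detections while the GPU drains them.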

Worth noting: running Metis on a Jetson Orin host is now supported. There's a guide for it here if you want to go down that route.

Hope that helps!


  • Cadet
  • March 30, 2026

Is there any wait list or early access to buy the hardware? I'm really looking into running some heavy LLMs with these.


  • Author
  • Cadet
  • March 30, 2026

I don't think so. It's been months and nothing. I'd say look elsewhere.


Spanner
Axelera Team
  • Axelera Team
  • March 30, 2026

Is there any wait list or early access to buy the hardware? I'm really looking into running some heavy LLMs with these.

There isn’t currently a wait list, but we’re getting closer! I honestly don’t know any dates myself yet, but there’s tonnes of work going on around Europa in the background, and it’s so close I can almost taste it! 😆