I’ve read a number of posts and am still not sure. I have a Minisforum UM890 Pro with a Ryzen 9 8845HS and 64 GB of RAM, running Windows. I’m experimenting with various hardware setups and am wondering whether this would help inference speed if used in my M.2 slot. I also have USB4 and OCuLink ports, so I could use an external PCIe adapter.

Is it plug and play, or do I need to use the SDK to configure it?

TIA!

Howdy @brogersao! Welcome to the big show!

So, the Metis M.2 is geared towards computer vision rather than LLMs. We’ve done some early experimentation with running LLMs, and it has actually gone really well! However, those experiments used the Metis PCIe card rather than the M.2. And even then, it’s still early days (although watch this space).

Depending on the host platform, it’s reasonable to say it’s plug and play, but to really put Metis through its paces you’d want the SDK.

Hope this helps! If there’s anything else you’re curious about, just ping!

What’s the project you’re building, by the way?
