Building the Smallest AI PC
I found this video on YouTube. In it, Alex Ziskind sets out to build the smallest, most powerful computer he can for running LLMs. It was a good watch, and it made me wonder if I should build a tiny AI computer of my own. Components are changing and improving as fast as AI is, and now might be a good time to jump in and build one.
Ziskind gets into the weeds of the needed components, weighs size against computing speed, and ends up comparing his build to his friend's M3 setup. Before I get further into the video, let's talk about what an AI computer needs.
Gaming PCs vs AI PCs
AI is new to the scene, and it brings performance demands we haven't really seen in PCs outside of gaming machines. There's a reason Nvidia has risen to the challenge of AI faster than other companies: its focus on gamers is what positioned it to lead in AI. Gaming needs fast graphics, i.e., GPUs. The GPU was originally invented for gaming, so it's optimized for fast, highly parallel work. AI uses that same kind of power, just a lot more of it.
Running AI locally, though, means a steep jump in memory requirements. For a gaming machine, 8GB of VRAM is good. For AI, you want to start at 24GB. If you want to know more about what levels of AI you can run with what amounts of VRAM, here’s some more info.
In that video I mentioned earlier, Alex Ziskind, the creator, made a gaming/AI PC hybrid. You can run AI on a pure gaming machine if you want to; I wrote an article that shows you how to upgrade your gaming machine to run AI. Keep in mind that the GPU matters more than the CPU when it comes to running AI, and memory matters most of all.
Numbers game
The GPU requirements boil down to a numbers game.
For an entry-level user, you’d need a GPU with 8GB VRAM. Couple that with DeepSeek Bronze or Silver. For a power user, you need 24GB VRAM and DeepSeek Gold. A pro should use 96GB VRAM paired with DeepSeek Platinum.
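To make the numbers game concrete, here is a rough back-of-the-envelope sketch of how model size maps to VRAM. The formula (parameter count times bytes per weight, plus roughly 20% overhead for the KV cache and activations) is a common rule of thumb, not something from the video, and the exact overhead varies by runtime and context length.

```python
def estimate_vram_gb(params_billions, bits_per_weight=4, overhead=1.2):
    """Rough VRAM estimate: weight storage plus ~20% for KV cache
    and activations. A rule of thumb, not a guarantee."""
    weight_bytes = params_billions * 1e9 * (bits_per_weight / 8)
    return weight_bytes * overhead / 1e9

# An 8B model at 4-bit quantization fits comfortably in an 8GB card:
print(round(estimate_vram_gb(8), 1))    # ~4.8 GB
# A 70B model at 4-bit needs roughly 42 GB -- past 24GB, into pro territory:
print(round(estimate_vram_gb(70), 1))   # ~42.0 GB
```

This is why the tiers jump the way they do: each step up in model class multiplies the memory bill, and quantization only buys back so much.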
Beware when buying a PC marketed as an AI PC. They are often overpriced and don’t prioritize the GPU or memory that matter most. With those, you usually end up choosing between smarter-but-slower and faster-but-unremarkable.
That being said, let’s look at Alex Ziskind’s solution. He set out to make the smallest possible AI computer.
Components

- GPU: RTX Pro 6000 (96 GB VRAM) ($10.5k)
- SSD: Samsung 9100 Pro 2TB ($230)
- CPU: Ryzen 9 9900X ($368)
- 2 Noctua case fans ($70)
- Motherboard: ASUS ROG Strix X870-I ($350)
- Case: Cooler Master NR200P V2 ($150)
The result was a small computer with the GPU to run a large AI. In a head-to-head comparison with an M3 workstation, this tiny computer held its own.
Why does this matter?
You need a computer at this level to run a large AI, like DeepSeek Unchained. Gaming machines are good, and better if you upgrade them, but a build like this gets you closer to running a powerful AI privately on your own machine while taking up less space. If you can squeeze more tokens per cubic cm, then you’re getting more value; you’re saving computing costs in the long run.
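The "tokens per cubic cm" idea can be turned into a trivial density metric: generation throughput divided by case volume. The numbers below are made up purely for illustration; they are not benchmarks from the video.

```python
def tokens_per_litre(tokens_per_sec, case_volume_litres):
    """Throughput density: generation speed divided by case volume.
    Higher means more AI per unit of desk space."""
    return tokens_per_sec / case_volume_litres

# Hypothetical figures (NOT measured): a small-form-factor build
# in an ~18 L case vs. a ~60 L full tower running the same model.
sff = tokens_per_litre(40, 18)
tower = tokens_per_litre(45, 60)
print(round(sff, 2), round(tower, 2))  # 2.22 0.75
```

Even if the tower is slightly faster in absolute terms, the small build delivers roughly three times the throughput per litre in this made-up example, which is the sense in which size itself becomes part of the value calculation.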
About DeepSeek

DeepSeek changed the game as far as tokens and computing cost go. Before DeepSeek was released, OpenAI had reportedly considered subscription prices as high as $2,000 per month for advanced AI models, including the reasoning-focused “Strawberry” and “Orion,” according to a September 5, 2024 report by The Information. After DeepSeek-R1 launched in January 2025, those premium prices never materialized; OpenAI’s standard subscription stayed at $20/month. Running your AI on a small, portable computer can continue that journey. You can build a really small AI computer. And it will have the VRAM you need to host a large AI off the cloud! That’s significant for running DeepSeek Unchained. Give it a try.