
Last Update: March 3, 2025


By eric



Building Your Own AI Rig: More Memory, More Power

AI workloads demand significant memory, depending on the complexity of tasks such as training deep learning models, running large language models (LLMs), or performing high-resolution image/video processing. Ample VRAM and powerful GPUs help with most of these tasks, but a large amount of system memory is also necessary even when the workload is not particularly training-oriented.

For example, suppose we want to run a local code completion and suggestion service with ollama to ensure privacy, security, and the protection of trade secrets. Depending on the model and its parameter count, we will need at least a certain amount of memory. As the ollama documentation states:

You should have at least 8 GB of RAM available to run the 7B models, 16 GB to run the 13B models, and 32 GB to run the 33B models.

For the best performance, we will be conservative and recommend 8GB–16GB for 7B models, 16GB–32GB for 13B models, 32GB–64GB for 33B models, 256GB–512GB for 400B models, and 512GB–1024GB for 700B models. DeepSeek-R1's largest model has 671B parameters, so running it requires at least 512GB of memory.
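These figures follow from a simple rule of thumb: RAM needed is roughly the parameter count times the bytes per parameter (set by the quantization level), plus runtime overhead. A minimal sketch, assuming 4-bit quantization (common for locally served models) and a 1.2x overhead factor for the KV cache and runtime buffers; both figures are assumptions, and real usage varies with context length:

```python
def estimate_ram_gb(params_billion: float, bits_per_param: int = 4,
                    overhead: float = 1.2) -> float:
    """Rough RAM estimate: weight size * quantization width * overhead.

    The 1.2 overhead factor (KV cache, runtime buffers) is an assumption,
    not a measured value.
    """
    weight_gb = params_billion * bits_per_param / 8  # GB for the weights alone
    return weight_gb * overhead

# Rough estimates for common model sizes at 4-bit quantization:
for p in (7, 13, 33, 671):
    print(f"{p}B -> ~{estimate_ram_gb(p):.0f} GB")
```

This gives roughly 4GB for 7B, 20GB for 33B, and about 400GB for 671B parameters, which is why the conservative recommendation above lands at 512GB for DeepSeek-R1-class models.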

Memory Capacity

How much memory a computer can support depends on several factors:

  • CPU Architecture & Memory Controllers

    CPU Series                  | Max Memory             | Memory Channels | Use Case
    AMD Ryzen 7000              | 128GB (192GB possible) | Dual            | Consumer/Light AI
    Intel Core i9 14th Gen      | 128GB (192GB possible) | Dual            | High-end desktop, gaming, productivity
    AMD Threadripper Pro 7000   | 2TB                    | 8-channel       | Workstation, AI workloads
    Intel Xeon Scalable 3rd Gen | 4.5TB                  | 6-8 channel     | Enterprise AI, data centers
    AMD EPYC 7002               | 4TB                    | 8-channel       | Enterprise AI, servers
  • Number of Memory Channels: More channels mean both higher bandwidth and higher total capacity, since each channel supports its own set of DIMMs

  • Number of Memory Slots and Motherboard Support: Even if a CPU supports large amounts of memory, the motherboard must have enough memory slots to install that much RAM

  • CPU Configuration: In dual-CPU systems, the total memory capacity doubles (e.g., 2x EPYC 7002 CPUs = 8TB max RAM).

So memory capacity in a computer is determined by a combination of CPU architecture, motherboard design, and memory slot support. High-end consumer CPUs, such as Intel Core i9 and AMD Ryzen, typically support up to 192GB of RAM across four DDR5 slots. Workstation and server-grade CPUs, like AMD Threadripper and Intel Xeon, support significantly more memory, often exceeding 1TB, thanks to multi-channel memory architectures and ECC (Error-Correcting Code) support. The motherboard plays a crucial role, as it dictates the number of memory slots and the maximum supported RAM, with workstation and server motherboards offering up to 12 or more DIMM slots for extreme memory configurations.
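The two limits described above, capacity and bandwidth, are each a simple product. A sketch with illustrative numbers (the 64-bit channel width and DDR5-4800 transfer rate are standard figures; the slot and module counts are examples matching the table above):

```python
def max_capacity_gb(slots: int, module_gb: int, cpus: int = 1) -> int:
    """Capacity is bounded by slots * largest supported module, per CPU socket."""
    return slots * module_gb * cpus

def peak_bandwidth_gbs(channels: int, mt_per_s: int = 4800,
                       bus_bits: int = 64) -> float:
    """Theoretical peak bandwidth: channels * transfer rate * bus width."""
    return channels * mt_per_s * (bus_bits / 8) / 1000  # GB/s

# Consumer DDR5 board: 4 slots of 48GB modules.
print(max_capacity_gb(slots=4, module_gb=48))            # 192 GB
# Dual-socket EPYC 7002: 16 slots of 256GB modules per socket.
print(max_capacity_gb(slots=16, module_gb=256, cpus=2))  # 8192 GB = 8TB
# Dual-channel desktop vs. 8-channel server bandwidth:
print(f"{peak_bandwidth_gbs(2):.1f} GB/s vs {peak_bandwidth_gbs(8):.1f} GB/s")
```

The bandwidth gap (roughly 77 GB/s vs. 307 GB/s at the same transfer rate) is why multi-channel server platforms matter for memory-bound inference, not just for raw capacity.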

This guide breaks down AI machine builds into three categories based on memory capacity:

  1. 64GB–192GB Build – Entry-level AI workstation.
  2. >256GB Build – High-performance AI machine for professional workloads.
  3. >512GB Build – Enterprise-grade AI server for large-scale model running and training.

64GB–192GB AI Workstation

Best for: AI research, small-scale ML training, local inference, and AI-assisted applications.

Most consumer-grade computers available today with even just two DDR4 memory slots can be upgraded to 64GB of RAM (using two 32GB modules). This capacity is sufficient for running most small AI models locally without performance issues.

For systems requiring 128GB of RAM, you'll need a motherboard with either:

  • At least four DDR4 memory slots, or
  • At least two DDR5 memory slots

Due to CPU architecture limitations, standard consumer PCs using AMD Ryzen or Intel Core (Ultra) processors have different maximum memory capacities depending on their RAM type:

  • DDR4 motherboards: Maximum 128GB (four 32GB modules)
  • DDR5 motherboards: Maximum 192GB (four 48GB modules)

These capacity differences are important considerations when planning to run larger AI models locally or when handling memory-intensive workloads that require significant RAM resources.
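To see what these capacity ceilings mean in practice, you can compare each one against the conservative per-model RAM recommendations quoted earlier. A small sketch (the RAM figures are the upper ends of the ranges given above):

```python
# Conservative upper-bound RAM recommendations from the text (GB):
MODEL_RAM = {"7B": 16, "13B": 32, "33B": 64, "400B": 512, "700B": 1024}

def models_that_fit(ram_gb: int) -> list[str]:
    """Return the model sizes whose recommended RAM fits the given capacity."""
    return [name for name, need in MODEL_RAM.items() if need <= ram_gb]

for tier in (64, 192, 512):  # DDR4 dual-slot, DDR5 quad-slot, workstation
    print(f"{tier}GB: {models_that_fit(tier)}")
```

Even the 64GB entry tier comfortably covers models up to 33B, while nothing short of the workstation and server tiers reaches 400B-class models.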

Recommended Components:

  • CPU: AMD Ryzen 9 7950X / Intel Core i7-14700K / i9-14900K
  • Memory: 64GB–192GB DDR5 (Dual-channel)
  • Motherboard: High-end consumer motherboard with DDR5 support (e.g., ASUS ROG Strix X670E / Z790-E, MSI MAG B660M, GIGABYTE Z690 Aorus Pro)
  • GPU: NVIDIA RTX 4090 / 4080 / 4070 / 3090 / 3080 / 3070 / AMD Radeon RX 7900 XTX
  • Storage: 2TB NVMe SSD (PCIe 4.0)
  • Cooling: AIO liquid cooler or high-end air cooler
  • Power Supply: 850W+ Gold-rated PSU

Also, many gaming desktops (e.g., Lenovo Legion T5, Dell Alienware desktops) can serve this role as well, as they normally come with a capable GPU and four DDR5 memory slots.

Pros & Cons:

  • ✅ Affordable for AI experimentation
  • ✅ Great for running small AI models locally
  • ❌ Limited by memory capacity for large models or datasets
  • ❌ Slower than workstation-grade CPUs for heavy AI workloads

>256GB High-Performance AI Machine

Best for: Professional AI development, large dataset processing, fine-tuning models.

As discussed before, going beyond 192GB of memory requires a different class of CPU and a motherboard with many memory slots, a configuration typically found in workstations.

Recommended Components:

  • CPU: AMD Threadripper 7000 PRO / AMD EPYC 7xxx / Intel Xeon W9-3495X
  • Memory: 256GB–4TB DDR5
  • Motherboard: Workstation-grade motherboard (e.g., Gigabyte MZ32-AR0 / ASUS Pro WS WRX90 / Supermicro X13SWA-TF)
  • GPU: NVIDIA RTX 4080 / 4090 / 5080 / 5090 / RTX 6000 Ada / H100 PCIe / AMD Instinct MI300
  • Storage: 4TB NVMe SSD + 8TB HDD for data storage
  • Cooling: Custom water-cooling or high-end AIO
  • Power Supply: 1200W+ Platinum-rated PSU

Pros & Cons:

✅ Handles large AI models and training datasets
✅ Supports multiple GPUs
❌ Expensive compared to consumer-grade builds
❌ Requires a larger chassis and advanced cooling

The Gigabyte MZ32-AR0 is a server-grade motherboard equipped with 16 DDR4 memory slots, allowing for a maximum capacity of 512GB using 32GB modules. In contrast, the ASUS Pro WS WRX90 supports the latest DDR5 technology with 8 slots, accommodating up to 2TB of ECC R-DIMM DDR5 memory. For even greater capacity, the Supermicro X13SWA-TF features 16 DDR5 slots, offering up to 4TB of RAM, depending on the CPU configuration.

>512GB Enterprise AI Server

Best for: Large-scale AI model training, LLMs, deep learning research.

These machines are built for large-scale AI tasks, such as training LLMs and deep learning models, and for running enterprise-grade AI workloads.

Recommended Components:

  • CPU: AMD EPYC 9004 / Intel Xeon Platinum 8480+
  • Memory: 512GB–4TB DDR5 ECC (8-channel / 12-channel)
  • Motherboard: Server-grade (Supermicro, Tyan, ASUS ESC series)
  • GPU: NVIDIA H100 SXM / AMD MI300X (4–8 GPUs in NVLink)
  • Storage: 8TB+ NVMe SSD (U.2) + 100TB NAS for data storage
  • Cooling: Rack-mounted liquid cooling solutions
  • Power Supply: Dual redundant 2000W+ PSU

Pros & Cons:

✅ Handles extreme AI workloads
✅ Optimized for distributed computing
❌ Extremely expensive ($50,000+)
❌ Requires specialized server infrastructure

Conclusion

This guide provides a starting point for selecting components based on your needs. Choosing the right AI machine depends on your workload: for general AI tasks, 64GB–192GB is sufficient; for professional AI development and fine-tuning, 256GB+ is ideal; for enterprise-grade AI, servers with 512GB–1TB+ of memory and multiple GPUs are necessary.



Copyright ©  2025  TYO Lab