Storage: The LLAMA 70B model takes up around 130 GB on disk in FP16, so plan for at least a 512 GB drive to hold the weights plus any quantized variants and checkpoints.
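As a quick sanity check on that figure, the footprint follows directly from the parameter count times bytes per parameter. The snippet below is a minimal sketch of that arithmetic; it also shows why the same weights are quoted as both "~140 GB" and "~130 GB" (decimal GB versus binary GiB).

```python
# Rough weight-size arithmetic for a 70B-parameter model in FP16.
params = 70e9            # 70 billion parameters
bytes_per_param = 2      # FP16 = 2 bytes per parameter

size_gb = params * bytes_per_param / 1e9     # decimal gigabytes
size_gib = params * bytes_per_param / 2**30  # binary gibibytes (what many tools report)

print(f"FP16 weights: ~{size_gb:.0f} GB (~{size_gib:.0f} GiB)")
# -> FP16 weights: ~140 GB (~130 GiB)
```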
CPU: CPUs alone cannot run a model like LLAMA 70B at useful speeds; the heavy lifting is done on GPUs. Still, choose a modern CPU with plenty of cores and PCIe lanes so it does not bottleneck data loading and feeding the GPUs.
GPU: LLAMA 70B needs on the order of 2 x 80 GB GPUs, 4 x 48 GB GPUs, or 6 x 24 GB GPUs to hold the FP16 weights. A server with multiple high-memory, NVLink-connected GPUs, such as the NVIDIA A100 80GB, provides ample horsepower. With consumer NVIDIA GeForce RTX 40-series cards at 24 GB VRAM each, you can manage with 2 or more GPUs depending on the final model size after compression techniques such as quantization; see the sketch below.
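The GPU counts above come from dividing the weight footprint by per-card VRAM. The sketch below is a weights-only lower bound (it ignores activation and KV-cache memory, which add real per-GPU overhead), intended only to show how quantization changes the math.

```python
import math

# Lower-bound GPU-count estimate for a 70B-parameter model.
# Weights only: real deployments need extra per-GPU headroom
# for activations and the KV cache.
PARAMS = 70e9

def min_gpus(bytes_per_param: float, vram_gb: float) -> int:
    weights_gb = PARAMS * bytes_per_param / 1e9
    return math.ceil(weights_gb / vram_gb)

for label, bpp in [("FP16", 2.0), ("INT8 (8-bit)", 1.0), ("INT4 (4-bit)", 0.5)]:
    counts = ", ".join(f"{min_gpus(bpp, vram)} x {vram} GB" for vram in (80, 48, 24))
    print(f"{label:12s} -> {counts}")
```

At 4-bit quantization the weights alone fit on two 24 GB cards, which is where the "2 or more RTX 40-series GPUs after quantization" estimate comes from.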
Memory (RAM): The LLAMA 70B model needs roughly 140 GB of memory just to hold the FP16 weights, so treat 256 GB of RAM as the minimum; that leaves headroom for the OS and for staging the weights while they are loaded onto the GPUs.
Motherboard: Choose a motherboard with as many PCIe x16 slots as possible, support for up to 1 TB of RAM, and NVMe SSD slots for fast model loading.