Instagram (@adayinlife.unedited)

🧠 Run LLMs Locally with Ollama: Complete Step-by-Step Guide


✅ What is Ollama?

Ollama is a free, open-source tool that makes it easy to download, run, and chat with large language models (LLMs) locally — directly from your terminal or via APIs.
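For a feel of both interfaces: once installed, you can chat from the terminal or call the local REST API, which Ollama serves on port 11434 by default. The model name `llama3.2` below is just an example; substitute any model you have pulled.

```shell
# Chat interactively from the terminal (model name is an example)
ollama run llama3.2

# Or query the local REST API (served on localhost:11434 by default)
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

Both commands assume the Ollama server is already running in the background.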


🧰 System Requirements

RAM: 16–64 GB (depends on model size)
CPU: any modern 4+ core processor
GPU (optional): Apple M1/M2/M3 (Metal) or NVIDIA (CUDA)
Disk space: ~4–30 GB per model

🔽 STEP 1: Install Ollama

Option 1: 🛠️ Using Homebrew (macOS)

```bash
brew install ollama
```

Option 2: 🌐 Manual Installer (macOS & Windows)

Go to https://ollama.com/download

Download and run the .pkg (macOS) or .exe (Windows) installer.
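Whichever installer you use, you can confirm it worked by asking the CLI for its version from a fresh terminal:

```shell
# Prints the installed Ollama version if the install succeeded
ollama --version
```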

Option 3: 🐧 Linux Shell Script
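On Linux, the official one-line install script from ollama.com can be piped straight to the shell:

```shell
# Downloads and runs Ollama's official Linux install script
curl -fsSL https://ollama.com/install.sh | sh
```

If you prefer to inspect the script before running it, download it first and read it, then execute it manually.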