Ollama

Run LLMs locally

LLM Frameworks · Free

Run open-source LLMs locally with a simple CLI and API. Supports Llama, Mistral, CodeLlama, and more.

Why it matters

  • No API keys; full privacy and control.
  • Easy install and model switching.
  • REST API for local tooling.
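To illustrate the last point, here is a minimal sketch of calling Ollama's local REST API from Python, using only the standard library. It assumes the Ollama server is running on its default port (11434) and that a model such as `llama3` has already been pulled; the helper names are illustrative, not part of Ollama itself.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local port

def build_generate_request(model: str, prompt: str, stream: bool = False) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": stream}

def generate(model: str, prompt: str) -> str:
    """POST a prompt to the local Ollama server and return the response text.

    Requires a running Ollama instance; `model` must already be pulled.
    """
    body = json.dumps(build_generate_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (needs a local server): generate("llama3", "Why is the sky blue?")
```

With `stream` set to `False` the server returns a single JSON object; with streaming enabled (the API default) it emits one JSON object per line, which would require line-by-line parsing instead.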

Key Specs

  • Self-hosted: Yes (local).
  • License: MIT.