
Best local AI tools

A curated list of tools that support local or self-hosted AI inference.

| # | Tool | Description | Category | Pricing |
|---|------|-------------|----------|---------|
| 1 | Ollama | Run LLMs locally: pull and run Llama, Mistral, Gemma, and 100+ other models with one command, with an OpenAI-compatible API | LLM Frameworks | Free |
| 2 | Continue | Open-source AI coding assistant that can connect to locally hosted models | Code / DevTools | Free |
| 3 | LlamaIndex | Python RAG framework: connect LLMs to 160+ data sources with production-grade retrieval pipelines | LLM Frameworks | Free |
| 4 | Open Interpreter | Open-source tool that lets LLMs execute code on your local machine from natural-language instructions | Code / DevTools | Free |
| 5 | Replicate | Run thousands of open-source ML models via API: LLMs, image generation, audio, and video without GPU management | LLM Frameworks | Freemium |
| 6 | Hugging Face | The AI community hub: 900K+ models, 200K+ datasets, an Inference API, and Spaces for the open-source ML ecosystem | LLM Frameworks | Free |
| 7 | Mistral Codestral | Mistral AI's code-generation model, built for code completion and fill-in-the-middle tasks | Code / DevTools | Paid |
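To illustrate the OpenAI-compatible API mentioned for Ollama: by default Ollama serves on `localhost:11434`, and any OpenAI-style client can talk to it by pointing at that base URL. The sketch below builds a chat-completions request body for that endpoint; the model name `llama3` and the prompt are assumptions about a local setup, and the actual HTTP call is left commented out so the snippet stands alone.

```python
import json

# Ollama's OpenAI-compatible endpoint on a default local install (assumption:
# the server is running on the default port).
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> str:
    """Build an OpenAI-style chat-completions request body as JSON text."""
    body = {
        "model": model,  # any model already pulled locally, e.g. "llama3"
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    return json.dumps(body)

payload = build_chat_request("llama3", "Why is the sky blue?")
print(payload)

# To actually send it (requires a running Ollama server):
# import urllib.request
# req = urllib.request.Request(OLLAMA_URL, payload.encode(),
#                              {"Content-Type": "application/json"})
# print(urllib.request.urlopen(req).read().decode())
```

Because the request shape matches OpenAI's chat-completions API, existing OpenAI client libraries also work against Ollama by overriding the base URL.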