# Best local AI tools

A curated list of tools for running AI models locally or self-hosted, along with hosted platforms from the open-model ecosystem.
| # | Tool | Description | Category | Pricing |
|---|------|-------------|----------|---------|
| 1 | Ollama | Run LLMs locally: pull and run Llama, Mistral, Gemma, and 100+ other models with one command, served through an OpenAI-compatible API | LLM Frameworks | Free |
| 2 | Continue | Open-source AI coding assistant for editors such as VS Code and JetBrains, with support for local models | Code / DevTools | Free |
| 3 | LlamaIndex | Python RAG framework: connect LLMs to 160+ data sources with production-grade retrieval pipelines | LLM Frameworks | Free |
| 4 | Open Interpreter | Open-source tool that lets LLMs execute code (Python, shell, and more) on your own machine through a natural-language interface | Code / DevTools | Free |
| 5 | Replicate | Run thousands of open-source ML models via API: LLMs, image generation, audio, and video, without managing GPUs | LLM Frameworks | Freemium |
| 6 | Hugging Face | The AI community hub: 900K+ models, 200K+ datasets, an Inference API, and Spaces for the open-source ML ecosystem | LLM Frameworks | Free |
| 7 | Mistral Codestral | Mistral AI's code-generation model for completion and fill-in-the-middle across 80+ programming languages | Code / DevTools | Paid |
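The Ollama entry above mentions an OpenAI-compatible API. As an illustration, here is a minimal Python sketch of the request shape that endpoint expects, assuming Ollama's default local port (11434) and an example model name (`llama3.2`) that would first need to be pulled:

```python
import json
import urllib.request

# Default address of a local `ollama serve` instance. The model name used
# below is an example; fetch it first with `ollama pull llama3.2`.
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completion payload for a local model."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # return one complete response instead of chunks
    }

payload = build_chat_request("llama3.2", "Why is the sky blue?")
print(json.dumps(payload, indent=2))

# Actually sending the request requires a running Ollama server with the
# model pulled; uncomment to try it:
# req = urllib.request.Request(
#     OLLAMA_URL,
#     data=json.dumps(payload).encode("utf-8"),
#     headers={"Content-Type": "application/json"},
# )
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the endpoint mirrors the OpenAI chat-completions schema, existing OpenAI client libraries can usually be pointed at a local Ollama instance just by changing the base URL.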