
Ollama

Developer Tools · by Ollama · Est. 2023

Run LLMs locally with one command. The simplest way to get Llama, Mistral, and other models running on your machine.

#local #cli #open-source #simple

💰 Pricing

Free / Open source

✨ Key Features

✓ One-line install
✓ Model library
✓ OpenAI-compatible API
✓ GPU acceleration
✓ Custom models
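
The OpenAI-compatible API listed above means a local Ollama server can be called with standard chat-completion requests. The sketch below uses only the Python standard library and Ollama's documented `http://localhost:11434/v1` endpoint; the model name `llama3` is an example and assumes you have already pulled it (e.g. `ollama run llama3`).

```python
import json
import urllib.request

def build_chat_request(model, prompt, base_url="http://localhost:11434/v1"):
    """Build an OpenAI-style chat completion request for a local Ollama server."""
    url = f"{base_url}/chat/completions"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, payload

def chat(model, prompt):
    # Sending the request requires a running Ollama server with the model pulled.
    url, payload = build_chat_request(model, prompt)
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # Response follows the OpenAI chat-completions shape.
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(chat("llama3", "Say hello in one word."))
```

Because the endpoint mirrors the OpenAI API, existing OpenAI client libraries can also be pointed at the same base URL instead of hand-rolling requests like this.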